Unix: History and Memoir

Good afternoon, everyone. It's a real pleasure to be speaking with you today. I do wish I could be with you in person in Buenos Aires, but that's obviously not possible; today I am in Princeton, New Jersey, which is about 75 kilometers southwest of New York City. So, if I can play this... there it goes: there I am in Princeton, New Jersey, not too far from New York City. And to bring it a little closer to where you are, I'm about 8,500 kilometers north of Buenos Aires. I hope that wherever you are during these dangerous times, you and your families are safe and well.

One of the reasons that we can have a conference like this, where the people involved are spread all over the world, is that we have the internet to connect us. This would not be possible without the internet, of course, and much of the software that runs the internet, runs our own computers, and runs applications like Zoom, which we're using today, is based on the Unix and Linux operating systems. The Unix system was created just over 50 years ago at Bell Labs in Murray Hill, New Jersey, not far from where I live; that X, as you can see, not too far from New York or from Princeton, is where Bell Labs at Murray Hill is located.

What I'm going to do today is talk mostly about the early history of Unix: how it came about, and how it grew in the first 20 years or so. I also want to talk about what we can learn from the history and evolution of the Unix operating system, and maybe about how to manage creative processes with lots of very capable people. So that's where we're going.

I need to give you a little bit of history to set the stage for this. In 1885, a very long time ago, a company was formed called American Telephone and Telegraph, or AT&T. It was basically created by combining a number of small telephone companies in various parts of the United
States. The company grew and soon came to be the one that provided telephone service for almost everybody in the United States. It became fairly obvious to them quite quickly that they needed to be very expert in scientific and engineering fields to be able to provide telephone service for the whole country, and so in 1925 they created a subsidiary called Bell Telephone Laboratories, with the goal of doing research in the kinds of things that would make the telephone system work better.

That company, Bell Telephone Labs, or Bell Labs, was originally in New York City, just about where I was showing there, but in about 1960 it was moved to suburban New Jersey, in the location that you see marked there. It was quite a large operation; this is an aerial view of the Bell Labs operation in 1961. This is the place where fundamental research in things like physics, chemistry, and materials was performed. There were at least three thousand people working there, over one thousand of them with PhDs in fields like physics and chemistry, and this was the home of a very large number of important inventions. Perhaps the most important is the transistor, but also the laser and fiber optics, and probably eight to ten Nobel Prizes for work that was done by people here at Bell Labs.

Now, this was Bell Labs research. There were also, in effect, around it — not physically in the same place, but scattered around — about 25,000 other people doing development: converting the results of research into telephone products and services that would make telephone service work. They were part of Bell Labs organizationally, but they were not part of Bell Labs research, and they were not here at Murray Hill. And this was still a small part of the overall company: at this point AT&T had over one million employees, so it was by far the largest employer in the country. Most of the research at Bell Labs at this time was in
physical sciences, but there was also a fair amount of work in mathematics, particularly the mathematics of communication systems, and a couple of very significant pieces of work were done there. For example, Claude Shannon invented information theory while he was at Bell Labs in the late 1940s, and Dick Hamming invented error-correcting codes, also at Bell Labs, in the late 40s and early 50s. Interestingly, Claude Shannon and Dick Hamming shared an office — that must have been an absolutely amazing place to be.

In the early to mid 1960s, the mathematics and communications research area spun off a small group of people who did research in computer science — maybe 25 people. They worked on programming languages, they worked on operating systems, and a variety of other things that were the computing end of mathematics, and most of them were located in one small geographical area. Towards the top of that picture you can see two red dots. The red dot that is down and to the right is the office where I was when I came there as a summer intern in 1967, and by coincidence the person in the office next to me was Dick Hamming — an interesting way to start my career at Bell Labs. The second red dot, up and to the left from there, is where I spent another 30 years as I worked at Bell Labs. The Unix operation — in fact the whole computing science operation — was basically in those two side wings, the corridor between them, up underneath the roof on the sixth floor. So that's the physical location, if you like.

How does this lead to Unix? Well, there were a variety of people working on things related to software, and in particular some of them were involved in a specific operating system. In 1961 at MIT, this man, Fernando Corbató, created a time-sharing system called CTSS, the Compatible Time-Sharing System. This was done at a time when most computing
involved punch cards handed over a counter to people who operated very large computers. CTSS was a computer that you interacted with remotely, from a typewriter-like device, and it was an absolutely wonderful system to use; people loved it. And so Corbató and his friends decided that they would build the next version of such a system, which was going to be called Multics. It was basically going to be an information utility — sort of like cloud computing, but before the cloud existed.

MIT enlisted two other organizations to help with this. One was General Electric, which at the time made computers, and the other was Bell Labs, which at the time was really expert in building operating systems. So there were probably half a dozen people in that computing science research group at Bell Labs in Murray Hill who were working on Multics remotely, and three of those people are pictured on the bottom here: Ken Thompson, Dennis Ritchie, and Doug McIlroy.

Multics started in 1965. It had a lot of very, very good ideas, but it turned out that it was in many ways too complicated, too expensive — too much, too soon — and it became clear that it wasn't going to provide what it promised to provide, at least in a reasonable time period. So early in 1969 Bell Labs withdrew from the project, and that left Ken and Dennis and Doug with nothing to do. They had gotten used to a really, really nice computing environment, the combination of CTSS and Multics, but now they had nothing. So they started to think about how you would build your own operating system — something like Multics, something that would be a nice operating system for people to use — and they spent a lot of time trying to design such a system, but it was entirely a paper design. Somewhere during that period, Ken Thompson found a machine that wasn't being used, on which he could do some experiments. This was a machine called a PDP-7, made by DEC, the Digital Equipment Corporation. It had originally
arrived in 1965, and so by 1969 it was already
obsolete, and it wasn't a very powerful machine anyway, but Ken used it to do some experiments with file systems — just how do you manage information coming from a disk and going to a disk. At some point he made this famous remark: he said, "I realized that I was three weeks from an operating system." The story that Unix was created in three weeks by one person is such a good one that I actually asked Ken about it somewhat later on, after we had both retired from Bell Labs, and this is the mail that I got from him early in 2003. It basically says that, yes, Unix was first a file system implementation, just to see how much disk I/O you could do, but it was hard to get information to and from the disk, and so he needed something more. By coincidence, his wife Bonnie and their son were going off to visit Ken's parents in California, and at that point Ken decided that he was almost there with an operating system, and so in three weeks he wrote a shell, an editor, and an assembler. That was the birth of Unix.

You'll notice it was a pretty small machine: 8K 18-bit words, so call it 16K bytes. Half of that was devoted to the operating system kernel and the other half was the user space. That's pretty impressive software productivity — from nothing to an operating system in three weeks.

The PDP-7, however, really was a pretty small and limited machine, already obsolete, so what could you do to scale up? Ken and Dennis and others spent quite a bit of time trying to convince their management to spend a lot of money to buy a really powerful computer — they wanted half a million dollars to buy a DEC PDP-10 or something of that sort. But management had been badly bitten by the Multics experience and was extremely reluctant; they dragged their feet and never approved the purchase of this expensive machine. However, about that time, in early 1970, DEC announced the PDP-11, which was a much more modern machine with some very nice properties, and one of those properties was that it only cost fifty thousand
dollars, not five hundred thousand dollars, and so money was found to buy a PDP-11. This is a famous picture of Ken Thompson and Dennis Ritchie — Ken is sitting, Dennis is standing — by that PDP-11/20; I think this picture is from roughly 1971. You can see a number of things about the computing environment at that time. In particular, the input and output devices were Model 33 Teletypes. Ken is typing at one of these things — it's like a very uncomfortable electric typewriter, and it prints on rolled paper; you can just see the roll of paper slightly above where his hands are, and also on the other one to the right.

This was the machine that the first real Unix appeared on, in roughly 1971. In fact the first edition of Unix — the first thing that was thought of as a coherent whole, ready to go, in use, and completely self-supporting — was late in 1971. It was recognizably Unix: if you started to use it today, it would feel familiar. It supported, of course, multiple simultaneous independent users — it was a real time-sharing system in that sense — it had a file system, and all the rest of it. One of the most intriguing things about it was that it was well documented. That was the system that introduced the famous man page, or manual page, style that we see in Unix even to this day. This is the manual page for the cat command — these dates are written in American style, so that's November 3rd, 1971 — and it includes, down towards the bottom, a BUGS section, which was something that you would never see in systems provided by computer manufacturers, but which has been a standard part of Unix man pages for a long time.

So this is the first thing that you would call really, truly Unix, and it was used by a variety of people. That started the tradition of turning out a new version, or new edition, of the system roughly every six to eight months for quite a while. The second edition came out in June of the next
year, and there was a market forecast there that said the number of Unix installations had grown to 10. Those were being used inside Bell Labs for a variety of experimental work; the system had not gotten to the outside world at that point. But by May of 1975, the sixth edition had, in almost every respect, everything that you are familiar with in modern Unix systems, and it was the
first edition of the system that was actually used widely outside of Bell Labs as well as inside. It had, first, a hierarchical file system, which had been there from the beginning and in fact had been inherited from Multics. It had the idea that files are just bytes — that any structure in a file is imposed by the programs that interpret it, not by the operating system — and the idea that devices, like the terminal or the raw disk drive, are just files in the file system, so that you can manipulate them with the same system calls, the same reads and writes, that you would use to manipulate ordinary files in the file system. There was a programmable shell — it wasn't particularly easy to program, but it was programmable. Regular expressions, which are a characteristic of Unix systems, were already there in lots of different programs. Pipes, which I'm going to come back to in a few moments, had been introduced a year or two earlier. And there were a variety of interesting tools and languages, including tools that made it easier to develop more languages, and — something I was really interested in — document preparation tools. The system had been written in C since the third edition, so by that point a couple of years, and it was only about 9,000 lines of code: quite an impressive piece of work.

In fact it was documented in this book, or pair of books, by John Lions. John Lions was a professor from the University of New South Wales in Australia who spent a couple of sabbatical years at Bell Labs, one of them across the hall from me, and he wrote this wonderful commentary. One book listed the source code for the operating system — nine thousand lines of code — and the other book, so that you could read them side by side, was an explanation of what that code did, at a very detailed level. If you want to learn about how an operating system works, this is actually a very, very effective book, and you can find it on the web at this point.
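As an aside, the "devices are just files" idea is easy to demonstrate on any modern Unix-like system. Here is a minimal sketch of my own (the file names are arbitrary, not from the talk):

```shell
# A regular file and a device file are handled through the same
# open/read/write interface -- the shell redirection syntax is
# identical for both.
echo 'hello' > /tmp/demo.txt    # write a regular file
cat /tmp/demo.txt               # read it back
echo 'discarded' > /dev/null    # write to a device: the data vanishes
wc -c < /dev/null               # read from a device: zero bytes
```

The same redirection syntax works on terminals, disks, and pipes precisely because the kernel presents them all through the file system interface.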
Something else I wanted to talk about, because it showed up by the sixth edition — it was actually there a while before the sixth edition — is the notion of pipes. For many years Doug McIlroy, pictured here again, had had this idea that you ought to be able to connect programs together, and his metaphor for this was that you could take one program, like a section of garden hose, and screw it together with another section of garden hose, and that way information would flow through a sequence of programs to get something done. Doug had this idea in 1964 — that's a not-very-clear copy of a paper that he wrote at that point — but not much happened with the idea, I think partly because Doug's idea was a much more general connection of programs: not just a linear sequence, but potentially any arbitrary graph of programs.

Somewhere along the way, Ken Thompson figured out how to make linear connections of programs really easy, and so he added a system call to Unix, the pipe system call. It made it possible to connect the output of one process to the input of another process. As for the notation, there was a brief period — only a couple of weeks, I think — where there was a bad notation, and then somebody came up with this vertical-bar notation, which is so natural. And people started to do things. The first example there says: run the who command, which tells you who's logged on to the system at this point and produces one line per logged-on user. grep is a pattern-searching program that just looks for patterns — in this case it's looking for the pattern "joe" — and we'll talk about grep more in a moment. So the output of who, the list of users, goes through a filter that selects the lines that contain "joe", and then that goes into a program that counts the number of lines, the word-count program wc. That way you find out how many times Joe is logged in. Now, that by itself is probably not a very interesting example, but it's typical of the sort of thing that pipes
make possible: these very ad hoc, spur-of-the-moment connections of programs that get something done that would be hard if you had to go off and write a particular program of your own. When this pipe mechanism came along — I still remember this vividly — people went into a frenzy of inventing things and of upgrading programs so that they would work in pipelines. Think about the sort command, for example. The sort command reads a set of text lines, sorts them into alphabetical order, and prints them out again, so it obviously can't produce any output until it has seen all of its input — it can't fit in a pipeline except as a matter of packaging. But you want it to fit in a pipeline, and so sort was modified so that it would read from its standard input, and when it had finished the sorting it would write to its standard output. That made it possible to use sort in pipelines
as well; it was no longer a special case.

The third example there is a pipeline that originated with Steve Johnson — call it a quick-and-dirty sequence of commands that would make it possible to detect spelling mistakes. You would concatenate a bunch of files and run them through a command that would basically convert them into one word per line, in lowercase, throwing away punctuation. Then you would sort that, so at that point you'd have a sorted word list. Then you'd run it through a program that would throw away the duplicates, so you only had one instance of each word, and then through a program that would compare it to a dictionary and print out the words that were in the original document but not in the dictionary — and those were potential spelling mistakes. This was in fact the first draft of the spelling-mistake program, and it was surprisingly effective. Lots of other programs then used this idea that you can connect programs, and so you can modularize big tasks: you can make it so that each program does one piece of something and passes its data on to another program to do another part of the job. Pipes, in some ways, are the most characteristic thing about Unix as seen by its ordinary users. They're incredibly powerful, very easy to teach to people, and they lead to all kinds of interesting program design and modularization choices.

So what else? Well, the C programming language, which was created by Dennis Ritchie in roughly 1972 or '73, was an integral part of Unix at this point. The Multics project had had the idea of writing all of the software for the system in a high-level language, which is an excellent idea, and they were one of the first to try it. The language they chose, unfortunately, was PL/I, which turned out to be very much too large and complicated, too hard to compile, too hard to program in, and so for a while Multics used a language called BCPL, which was created by Martin Richards at the
University of Cambridge. Ken Thompson took BCPL and stripped it down to make a very, very simplified version of it, which he called B, and then Dennis took B and basically added data types to it to make C. So C was the language for system programming; it replaced things like PL/I and BCPL, and it was sufficiently expressive and efficient that you could do pretty much anything with it. It was intended for system programming — things like writing compilers or assemblers or text editors or the like — but it turned out that it was sufficiently expressive and efficient to write the operating system itself in C.

What happened then was an explosion of system programming applications, like the ones I've listed there, but also a C compiler that was portable. This was done by Steve Johnson, and it made it possible to port the operating system to other computers as well. The Unix system began life on a PDP-11, but with the portable C compiler, if you could write a compiler for another kind of architecture, then you could move Unix and all its programs and tools and so on comparatively easily to that new machine, and that portability was a very, very important aspect. That port was done, for example, to the Interdata 8/32, I believe, by Dennis Ritchie and Steve Johnson in about 1976, and a variety of other people did portability experiments as well.

I think the portability of Unix — the C programming language, the operating system itself, and all of those tools — enabled the workstation marketplace that started with companies like Sun Microsystems, MIPS, SGI, and a variety of others, and even the IBM PC. The IBM PC arrived in 1981, and although the first one wasn't very powerful, it soon became quite a reasonable machine, and it turns out that there were Unix systems that ran on the IBM PC. One of the most surprising of these was a system called Xenix, which was sold by Microsoft. One doesn't think of Microsoft as a Unix company, but for quite a while they were in fact the largest distributor of Unix
systems, in the form of this system called Xenix. One wonders what the world would be like today if Microsoft had followed through on the Unix work instead of going off and working with DOS and then Windows. I think the world would have been interesting, and certainly different in a lot of ways.
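The quick-and-dirty spelling pipeline described earlier can be approximated on a modern system roughly like this. The file names, and the particular choice of tr and comm, are my reconstruction, not necessarily Johnson's exact commands:

```shell
# Sample input: a tiny document and a sorted dictionary. In real use
# these would be your own files; the names here are made up.
printf 'The cat saw teh cat.\n' > /tmp/document.txt
printf 'cat\nsaw\nthe\n'        > /tmp/dict.txt

cat /tmp/document.txt |
    tr 'A-Z' 'a-z' |            # fold everything to lowercase
    tr -cs 'a-z' '\n' |         # one word per line; punctuation becomes newlines
    sort |                      # sorted word list
    uniq |                      # drop duplicate words
    comm -23 - /tmp/dict.txt    # keep words absent from the dictionary
# prints: teh
```

Note that comm requires both of its inputs to be sorted, which is exactly why the sort stage sits in the middle of the pipeline.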

Okay, so what else about Unix? One of the things that's particularly interesting about Unix is the idea of tools — the idea that you can write programs that you can then combine and use in various ways, as basically tools to help you build other things. I've mentioned some of the small tools, like grep and sort, wc, and diff for comparing text files. I mentioned the programmable shell, which lets you create complicated sequences of operations using these small tools, package them up, and run them as a single shell command. There are also tools for making languages, so that you can make more interesting languages for specific areas: Steve Johnson did the yacc compiler-compiler, which is still around, of course; Mike Lesk did lex, the lexical-analyzer generator, which is also still around; and Stu Feldman made make, which made it possible to automate the complicated sequences that were sometimes necessary to compile interesting programs. And some of the programs that came out of that were themselves programmable tools, or scripting languages — in particular awk, which Al Aho and Peter Weinberger and I worked on in the late 70s — and then other languages, like C++, which started with Bjarne Stroustrup in about 1980.
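In that spirit, the programmable shell lets you package a pipeline of small tools as a new tool of its own. A minimal sketch, assuming a POSIX shell — the function name and its exact contents are my own invention, not from the talk:

```shell
# wordfreq: a new "tool" assembled entirely from existing small
# tools glued together with pipes. It prints the N most common
# words in its input (N defaults to 10).
wordfreq() {
    tr 'A-Z' 'a-z' |        # fold to lowercase
    tr -cs 'a-z' '\n' |     # split into one word per line
    sort |                  # group identical words together
    uniq -c |               # count each group
    sort -rn |              # most frequent first
    head -n "${1:-10}"      # keep the top N
}
```

Usage would be something like `wordfreq 3 < essay.txt`. Nothing here was written specially; the shell simply composes sort, uniq, tr, and head into a tool none of them is on its own.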
So, lots of examples of tools. In fact the picture here, which shows building blocks, is meant to imply that Unix was really good for building things out of little blocks of programs. I want to talk for a moment about grep, because grep is in some ways the quintessential small tool. It's the thing that captures the idea of tools perhaps the best, and I think it's the place where we first started to think that what we were doing was in fact more building tools than just writing programs. It was created basically by Ken Thompson, almost overnight, by taking the text editor, which had pattern searching in it, and just throwing away the part that did file storage.

My first — I don't know how I would describe it — interesting experience with grep has a story behind it, and I want to tell the story briefly. One day I was called at Bell Labs by somebody who told me that he had just gotten an HP-35 pocket calculator. That dates the story: it must be from about 1972 or '73, because that's when the HP-35 calculator came along. What he told me was that he had discovered that when he held his calculator upside down, some of the numbers made letters, and he said, "You guys have a dictionary on your computer, don't you? Is there some way that you could use the dictionary on your computer to tell me what words I can make when I hold my calculator upside down?" Well, this was an interesting idea. I was in a research operation, and so it was really rewarding to be able to help somebody who had a nice practical problem. That was his question, and I asked him, "What letters do you think you can make on your calculator?" And he said the letters would be b, e, h, i, l, o, s. Here it is from one of my calculators — it's not very pleasant, but okay, those are the letters you could make. So I said to him, "Okay," and I typed this command. grep is the pattern-searching program we've been talking about; /usr/dict/web2 is the word list from Webster's Second International Dictionary of
English, so it's a very large word list — 235,000 words — but it's just words, one word per line. The thing in the middle is a regular expression, and it specifies a pattern: any arbitrary combination of any of those letters, repeated any number of times, but nothing else. So this will pick, out of those 235,000 words, the ones that could in principle be made on a calculator if you held it upside down. Anyway, I typed the command, and out came an astonishing list of words. I'm a fairly well-educated native speaker of English, and there are a lot of words there I've never seen in my life; I have no idea what they mean. Anyway, I sent the list off to him, and I think he must have been happy, because I never heard from him again. But he left me with this wonderful story — a beautiful example of, among other things, the role of special-purpose languages like regular expressions. Regular expressions are a language; they're not a programming language, but they are a language in which you can express a certain effect. The other thing it's a wonderful example of is how you can use a tool to do a job: if your tool is at hand, you can
get something done that would not be worth doing if you had to write a special program for it.

So what we had at this point — if you think about the C programming language, the pipes, the regular expressions, all these nice tools, the language development stuff — I think arguably this was in some sense a golden age. This was the time when everything was really perfect: Unix was everywhere, it ran on every kind of computer; C and its descendants, particularly C++, ran everywhere on everything; and all of these things were very widely used. There was a very, very large Unix community, with user groups all over the world. It was really a wonderful time, and people were recognized for the good work. Perhaps the most obvious recognition is that Ken and Dennis won the ACM Turing Award in 1983 for the combination of Unix and C.

There was even a certain amount of public awareness of Unix outside the technical community. Here's an example of public recognition: this is Ken Thompson and Dennis Ritchie receiving the National Medal of Technology in 1998 from President Bill Clinton. That's one form of recognition; here's another one. Perhaps you will remember this movie, Jurassic Park, which appeared in 1993. Most of the special effects for Jurassic Park were actually done on Unix systems — they were done on SGI workstations that ran Unix. This, of course, is a velociraptor attacking the people. How do you put those velociraptors back in their cages? That leads to one of those famous lines that every Unix person over a certain age knows all about: this is a young Lex, played by Ariana Richards, saying, "This is a Unix system — I know this." And so, by typing the appropriate commands, she manages to put the velociraptors back in their cages, and everybody is saved and lives happily ever after. So that's a high point of geekiness in this movie.

That was the golden age, but I think you could see that in the mid 1980s to the early 1990s a lot of things were causing
this to be not quite so good. There were a lot of commercial pressures. All of those workstations that people built, although they were very effective, also had the property that there were divergent versions of Unix: each manufacturer tended to add their own bells and whistles, making the system somewhat better for them and not so good for other people, and so there was a loss of standardization. There were contributions from a very wide range of people, but some were not as good as others, because the combination of cheaper hardware and fewer constraints on things like memory and speed made it possible for more programmers with less experience, and perhaps less taste, to build things. Microsoft, in particular, on the IBM PC, and the move of people to that environment, also caused a certain amount of change, I guess because Windows had a very strong focus on user interfaces — graphical user interfaces, that is — and consumer-level programming that was not part of the Unix world, and so a lot of very good people and a lot of good ideas and talent went in the direction of user interfaces for Microsoft. Those were some negative effects, but at the same time there were some wonderful positive effects. One was the development of Linux, starting in 1991, when Linus Torvalds began his system; the other was the rise of open source, which Richard Stallman started in the mid 80s and which became a major force.

So, the technical legacy: what did Unix do for us? There are a variety of things that I think belong in the legacy of this system. The first three that I've listed there — the hierarchical file system, the use of high-level languages, and a programmable shell — are all ideas that came from Multics. Multics had an enormous number of really good ideas, and many of those influenced Unix quite strongly. Pipes were original with Unix. The idea of programmable tools — small tools, lots of them, sometimes programmable — is more of a standard Unix thing. Regular expressions are an integral part of Unix;
you see them all over the place, in tools, the shell, and so on. And one of the things that's perhaps not noticed as much, but was very much a reason for the success of Unix, is the idea that almost all information is simply stored as ordinary text. There are relatively few things with heavily formatted data; it tends to be just flat text files. If you combine that with the pipes, the tools, and the
regular expressions, those all work together to make it possible to build things very, very effectively — things that communicate with each other because they share a common data representation.

Something else that I think Unix does particularly well, and it became a conscious thing at some point, is the idea that programs can create other programs. Now, we're used to that with things like compilers — a compiler takes a program and converts it into assembly language or something like that — but there are lots of other examples of this in Unix: for example yacc, which takes a grammar specification and converts it into a C program that is a parser for that language, or lex, which does the same thing for tokens. There are lots of examples of those sorts of things, and that idea has been pushed harder in Unix, I think, than in other environments. And of course portability itself is a very, very major contribution — the idea that once you wrote a C compiler for a new machine, you could take all of the Unix system and its tools and drop it onto that new computer with comparatively little effort.

And on top of all this there was a sort of philosophy of how to create software: the idea that you have smaller programs that do one thing well; that you use tools and think of yourself as, in many ways, building tools, not just programs; a notion of rapidly prototyping things to see how they will work before you invest a lot in them. So that's what I think the technical legacy of Unix is, and a lot of that was in fact in place in the 1970s. So you could argue, as Doug McIlroy is sort of doing here, that not much has happened after that. If you think about it, what are the main things that are different since, say, the seventh edition of Unix, which was 1979? Networking is an obvious one. The center of gravity of Unix development shifted to Berkeley in many ways, and the networking code there derives very strongly from what Bill Joy did with the socket interface on
Unix systems at Berkeley. Graphical interfaces: a lot derives from the work on X Windows that was done at MIT. Multiprocessors: that's a hardware improvement, and at this point that's where all the action is in lots of things. Unicode: the recognition that most parts of the world do not use ASCII characters but have lots of other things that they would like to use as well, and of course a very large number of languages. And we scale up: we're dealing with very much larger everything at this point, larger user populations, larger machines, and much more software. But the basic ideas, I think you could argue, were sort of in place for most of that back in the 1970s. So, question: how did Unix happen? Is it something that you could reproduce, could you get something like that again? I think nobody can claim planning on this; I think Unix was in fact an accidental combination of some factors. The first and the most important, of course, is to have good people: two exceptionally good people, Ken Thompson and Dennis Ritchie, who had very good taste, sort of an eye for what is the simplest, cleanest, most general mechanism that will get something done and solve a bunch of problems with one piece of code, and then write that code well. So they were just unbelievably good, but there was a very strong group of people around them. The other thing: the management structure at Bell Labs encouraged this kind of very good people, and certainly at the lower levels the management was technically exceptionally competent but was also very good managerially. I'd single out Doug McIlroy for that; I think Unix might not have happened without Doug's managerial guidance as well as his pure technical expertise. And of course, like lots of computing environments of the time, Unix rode the Moore's Law growth as hardware got smaller and cheaper and faster at an exponential rate for many years, and that helps with a lot of things. And then finally, the working
environment. Bell Labs was an astonishingly good place to work, and I can tell you that from having spent 30 years there; it really was an excellent place to work. The funding was stable: the way that Bell Labs was funded was that a very small tax was applied any time you made a phone call in the United States. A tiny slice of the money involved would go to Bell Labs to improve future telephone service, and that meant you could count on having revenue to support research for a long time. The organization was very stable; people stayed there for many years. And the company itself took an
exceptionally broad and long-term view. The job was to improve communication systems, and that's going to be a problem for a long time, so almost anything you wanted to work on was arguably relevant to building better telephone systems. That meant the company was not run by the quarter; it was run on multi-year periods. And all of this led to an environment which was also very cooperative and just plain fun; people enjoyed being there. So, could it happen again? Well, I don't think so for operating systems; I will bet there won't be another Unix-like operating system that takes over from Unix, but I don't know. But there are lots of other things that could have a similar effect, perhaps, because there are always good people; there really are an astonishing number of good people around. It's fashionable to say that managers are incompetent bunglers (think of Dilbert's pointy-headed boss), but in fact there are good managers around, and all of us know them. Hardware is cheap at this point, really cheap; you can get an amazing amount of computing power for next to no cost. And finally, the open source movement has led to the existence of an enormous amount of really excellent software that doesn't cost anything, and that gives people ways to build things that they might not have had in the past. It's true that the kind of unfettered, do-whatever-you-want research environment that Bell Labs had in the 60s and 70s is probably not as easy to find today. Industrial research, where it exists at all, tends to be smaller and much more focused on short-term kinds of things; and academic research, where curiosity-driven, unfettered research is supposed to happen, often doesn't have enough money, so that's a problem too. But I think it could happen again. Think about the list of things in your lives that you use technically that came from one or two people who had a good idea and
pursued it. Linux is the obvious thing, perhaps, in operating systems: that was one person doing something, and of course many, many people are involved now, but it started with one person. The programming languages that we use: almost all of those are languages that were first done by one or two people. C++ and Java, Perl and Python, PHP and Ruby and JavaScript; it just goes on. So languages are a place where you can definitely have an effect if you like. And of course some of the big companies that we deal with today were started by one or two people who had a bright idea as well: Microsoft and Apple back in the day, Google and Amazon and Facebook and so on more recently. So there are lots of ways that you could do something, not Unix, not an operating system, but something with a comparable effect on the world; I think there are still plenty of opportunities for that sort of thing to happen. I want to close with one observation from Dennis Ritchie about Unix. The observation is basically that Unix was meant to provide a community, a welcoming, comfortable place where people could develop programs and software. What it did was to bring people together, and they enjoyed their work; we really did in the original Unix group, and lots of places were like that. You enjoyed the people you worked with, and that led to doing good things. So I think what we should be doing today is to try and preserve that sense of community, that we're trying to actually work together to make the world a better place in many ways, and this conference is an absolutely great example of that: we've got thousands of people coming together to learn from each other, and that's good. This has been great fun for me; I've enjoyed being part of it, and I hope it's going to be fun for you as well. Thank you very much for your attention.

John, Brian, can you hear me? Yes. Yep. Okay, so first of all, welcome, and thank you so much for joining this event. This will
be kind of like a conversation between the two of us. Believe me, I have plenty of questions here, but maybe, John, you want to get started? Well, actually, I met previously with
Brian, and we talked a little bit about what to say, and one of the things I wanted him to get across to people was what it was like to work at Bell Laboratories. Now, he did cover some of that in his talk, but perhaps he could elaborate on that. I'm sorry about the flashing of my camera here; I'm having a little bit of trouble. But one thing I did want to say, and one of the things that really impressed me about Unix, was the elegance of the code and the elegance of the design, particularly back in those days. Brian, I know that your degree is in physics, not in computer science, and I normally tell people that we weren't doing computer science back then, we were doing computer black magic. So I think the elegance part of Unix is very nice. And John, I think I'm losing you, so let... ah, okay. So let me say a little bit about that. First, let me say that it is an absolute pleasure to be with you all, not quite physically, unfortunately, but one of the very few benefits of the current pandemic situation is that it's possible to actually be with people, in a sense, through Zoom, and so I hope you're all having a good time; it's certainly going to be fun for me. John raised the question of elegance in the Unix operating system, and I think in some ways that's a nice way to characterize it: in many respects it was so cleanly and neatly done that elegant is certainly the right word to describe it, in English, and I think the same could be applied to the C programming language. Some of this, I think, came about because the machines that Unix was written for, 50 years ago, were very weak by today's standards. They had a very limited amount of memory, they were not very fast, and their instruction repertoires were not very rich, so that enforced a certain amount of discipline on what you were doing; you could not build big complicated things, that was not feasible. But the other thing is that it was quite
possible... the thing that I think really made it elegant is the good taste and good judgment of the people who actually designed and wrote the Unix operating system, and that's Ken Thompson and Dennis Ritchie. Both of them had a taste for very simple, clean, uniform mechanisms. The phrase in English is sometimes "minimal": what can you get done with as little mechanism, of one sort or another, as possible? Both Ken and Dennis thought that way, they worked extremely well together, and I think as a result the system, at least in those early days, was something where elegant is a good word to apply to it. Well, I want to give the time to the people, because it's really their conference, but one thing I want to say about that is that it has always impressed me: the number of interfaces and the amount of functionality in the kernel itself, versus the huge number of programming interfaces that are in the libraries outside of the kernel, and in the shell, and in all the commands. The functionality is there, instead of being built into a huge, gigantic, and probably unstable kernel. Right, so I think there are two aspects to it. First, the kernel itself was not actually very big, and that speaks to what you just said: the extra functionality would tend to be written as libraries that used what was in the kernel. But the other thing is that the kernel itself had found suitable generalities, and it's back to this elegance question; it found generalities that made it possible to get a lot of bang for the buck, a lot of effect for a relatively small amount of code. I think the cleanest example of that is perhaps the interface to the file system, where there are really only four or five functions, system calls, that you need to get things done. There's open and close and read and write, and there's not much more than that. And that applies to all files uniformly, and it also applies to things like devices that were attached to the computer at the time.
And so that meant that the kernel itself could be smaller, and the interface to it, the API, could be smaller, without in any way limiting what you could build on top of it. So now I'd like to turn to the
moderators and ask them if they have questions from the audience that they would like to translate, and Brian and perhaps I could help answer. Absolutely. Is that okay, John, do you want to take the questions, or should I translate them? Okay, I have to find the questions; there's a whole bunch of them, and most of them are in Spanish, I think. Yes, so it would probably be best if you read them and translated them, if you could. Okay, good. So I will read them in the order of voting, since people are voting on the questions. The first one, the most voted, is actually in English. Nicolás is asking: could you tell us a memorable story about Dennis Ritchie? Ah yes, thank you, Nicolás. That's an interesting question, and you've of course put me on the spot to come up with a good one. One memorable story about Dennis, and it speaks to his basic generosity as a person: Bill Plauger and I were writing a book called Software Tools, in which we were trying to create programs written in Ratfor, a dialect of Fortran that sort of looked like C. We wanted to include illustrations of tools that programmers might find useful, and one of those would be a macro processor. This is something that we don't use a lot now, but think of something rather like the C preprocessor, only a little more general purpose and a little more systematic, not tied to C so much. So I wrote a terrible macro processor. It was just appallingly bad, because I wasn't a very good programmer; I don't think I really understood what I was doing, and what I produced was so terrible that it could not possibly have gone in the book. So I explained this problem to Dennis one day and said, gee, you know, I have this macro processor, it's terrible, I don't know what to do. And he went away, and he came back probably within a day, and he gave me a macro processor: the
M4 macro processor, which is still alive today and still used. He wrote it in C, obviously, and so I got to translate it into Ratfor, but he did all the work, and he never wanted any of the credit for it in any way whatsoever. So there we have, in some ways, Dennis encapsulated: a person of incredible talent, with the ability to write really good, useful code in very short order, and to give it away without the slightest interest in getting any kind of credit for it. So there, if you'd like, is a memorable story, at least to me, about a really memorable person.

I have a similar story. It was a USENIX conference, and we had a reception downstairs, and there were lots and lots of people standing around and talking with each other, and I looked over, and there was Dennis Ritchie kind of standing by himself with a glass of white wine in his hand, and nobody was talking to him. And I recognized the syndrome, where people are so in awe of somebody like Dennis Ritchie that they don't want to go up and talk to him, and that's terrible, you know. You have probably experienced the same thing, but Dennis was just the most open and friendly and nice person, one of the nicest I ever met. Yup, absolutely.

All right, let's go to the next question, and again, John, feel free to expand as well. The next question is from Joaquín. He's asking: what do you think was the biggest mistake you made during the Unix design? Did you solve it? Oh, thank you, Joaquín. I have an absolutely perfect answer for this excellent question, which is that I had nothing to do with the design of Unix, so there are no errors in it that are my fault, and I wouldn't change anything. Awesome. I could give you perhaps a different set of answers, since the people who made these decisions are not around right now to defend themselves. One: going back to Dennis Ritchie and the design of the C programming language, I think there has
always been a lot of question about whether the declaration syntax, especially for complicated chains of pointers to pointers to pointers, is the right design, because it's actually sort of unintuitive in a lot of cases. I think Dennis felt that it was absolutely fine in one sense, that the usage of something in a piece of code matched the declaration, but you had to read it in a funny order. So is that something that might have been different? Hard to say, but that's one. Another place where I think Dennis might have thought differently subsequently, but it was too late to change, would be the remarkably high number of levels of precedence in the C grammar: there are just a lot of operators with lots of levels of precedence, and some of those precedence choices are also a little counter-intuitive, in the sense that there are things that seem like they're at the wrong level. So those are things that perhaps in the C programming language might have been done differently, and perhaps even the authors would think so. Ken Thompson was once asked that same question, by the way, the question of what would you do differently if you were doing Unix over again, and he said: I would spell creat with an e. [Laughter] Okay, I don't know if you have any thoughts on this, John. Well, one thing I always questioned was the decision to make the int type match the "natural" word size of the machine; as you tried to move code from machine to machine with different word sizes, sometimes you would get stuck. From the very beginning, forcing or allowing the programmer to force a word size, even if it was going to be less efficient, I think would have been a good thing to have in the language.

All right, okay, I'll move to the next question. This one is from Yuba Pamper; he's in Israel. He's asking: if you were starting to develop Unix today, which programming language would you have chosen for the task?
Please don't say JavaScript. No, not a prayer. I would use awk. No, sorry. That's an excellent question, and today the answer is maybe not totally obvious. I was once quoted, a long time ago and I think accurately, as saying that if I were going to be marooned on a desert island with only one programming language, I would like it to be C, and I think arguably that's still the case, probably because I can make anything else with C, and the opposite is not always true. I can create any old thing I want with C, I will know that I have complete control over what's going on, I will need no significant, perhaps no, runtime support whatsoever, and it will run efficiently on pretty much any kind of computer. So in that sense I think C still makes a lot of sense as the language for an operating system. The problem is that C doesn't scale as nicely to very large code bases as other languages do. It doesn't make it as easy to create, for example, abstract data types of one sort or another; it doesn't make it easy to build firewalls between different parts of a program; and all of those things are important, and increasingly so as programs get bigger. If you're building something like the original Unix systems, which were thousands of lines of code, it's not as big a problem as with, let's say, today's Linux systems, which are tens of millions of lines of code; it's just a totally different game. So if I were going to do it today, I don't know. One possibility, and this will perhaps be surprising: one might think of C++. Not the full-blown everything, because it's a very large language, but find that subset of C++ which gives you control over what's going on in your program, lets you build firewalls between things, gives you some notational convenience, and still takes advantage of the things that C had, where you can put things exactly where you need to in memory and where you can access all the
capabilities of the underlying hardware. So that might be a possibility. Another language which is often suggested these days, and I've seen stories on it within the last few days, is Rust: whether Rust might be a suitable thing for building an operating system. I simply don't know enough about Rust to have an informed opinion on that. The last time I tried to write any Rust was some years ago; the language was at that point
a bit of a moving target, and the documentation was absolutely a moving target, and I just got frustrated, so I put it aside for a while. It's something I'd like to come back to, because there are clearly good ideas in it. So that would be a possibility. The other one that I might think about is Go, although it requires a certain amount of runtime support as well; it has a lot of advantages, including a cultural familiarity that makes it look very much like C, but a modern version. So those would be possibilities for me. I'd just like to add one comment, and it's about awk. I love awk, I really do, and I did some amazing things with awk, and for years I thought that awk was named that because it did awkward things gracefully. Then I found out that the three authors were Aho, Weinberger, and Kernighan. So thank you very much for your share of awk; I appreciate it. There's another Dennis Ritchie story there, while we're on the topic. Awk, of course, sounds the same as the name of a bird, the auk, in English at least, and Dennis was fond of finding bird stories that in some way or other related to auks, and he would send them to us. That's a really good one.

Okay, I have more questions here, guys, so many questions. I have one here from Fernando. He's asking: the Unix philosophy was "do one thing and do it well"; in recent years there has been a movement to build some huge commands that do a lot of things, like systemd. What are your thoughts on that move? Yeah. The Unix philosophy, which perhaps we backed into accidentally, that idea that each program would do one thing, it would do it well, and then there were ways to compose programs, in particular the pipeline idea, which was I think unique to Unix at that point: I think that worked really well for all kinds of things. It worked very nicely when the things you were building were not huge; it worked very nicely when the data that
you were passing around could take a uniform format, which is basically text, and in fact ASCII text. For all of those things I think that philosophy actually worked very well. There were places where it perhaps didn't scale, but in some ways it was forced upon us as well, because again the early machines were very limited. At the height of Sixth Edition Unix, which would be 1975 or so, we were running on PDP-11s that had perhaps 64K bytes of memory, and that's not a lot, so it encouraged smallish programs rather than very big programs. Early Unix systems also didn't do paging, they just did swapping, so that was yet another hardware-imposed constraint on the way you would build programs. Now, I've never used systemd, so I have no real concept of it whatsoever, but there are plenty of other tools around that are in effect their own subsystems. Git would be an example, perhaps, where there is one outside command, let's call it git, and then there are dozens or hundreds or, for all I know, thousands of subcommands within that. So there is one program that does a whole bunch of things, and within it are pieces that presumably do one thing each, and perhaps do it well. Another one like that is ImageMagick, which does a lot of graphics transformations: it's one program where you say convert, and then it has a lot of different sub-things within it that do their particular task. So in effect you've got a two-level structure, main command and subcommand, and I think for some areas that works rather well, but for others it's not entirely clear that it's the right thing. I am still comfortable with the Unix philosophy of smaller programs that do one thing, which I can then glue together in a variety of ways that were perhaps not thought of by whoever wrote them originally. I think it also depends on what you mean by "do one thing well". I mean, troff as an example is a single command line, but it has this whole
set of commands underneath it that influence what it does, and awk itself is another one like that; awk is certainly not a simple command. So I think it really
depends. On the other hand, I think ls has gotten completely out of hand, at least ls in the GNU world; the GNU syndrome there. Yes, and perhaps I'm overstating it a bit, but if I understand it correctly, the ls command has approximately 62 different options, one for each uppercase letter, each lowercase letter, and each digit, and I may be undercounting for all I know. [Laughter]

All right, okay, next question. This one is from Jorge, from Canada: was there some idea that you tried to build, and it was not possible with the technology of that time? And I think this is a follow-up on what you were talking about before, Brian. Yeah, that's an interesting question, things that we tried to build that were not possible with the technology of the time. I think that for the most part it was pretty obvious that you could only get so far, because you only had a certain amount of memory at your disposal. I think the limitations may have in some ways been more about hardware. I think of one experiment that I tried to do, a very early experiment in speech recognition, something that today we take for granted with systems like Siri or Amazon Alexa. Back in probably the late 70s we had a device that you could actually speak to, and it would give you text of what you said, and I thought, boy, wouldn't it be interesting to have a voice-driven version of something like, let's say, the shell, so that I could say "ls" and it would list my directory. But the hardware just wasn't really up to the job, and so what at the time seemed like a nice, interesting idea kind of got stuck on that. And of course it would have been hard to do that for free-form speech, as we would expect Siri or Alexa to handle today. So that's a place where the hardware limitations were pretty visible. I think it's a combination; it keeps leapfrogging itself: we keep building larger things because the hardware
allows us to, and there's also the concept of: can you do it, or can you afford to do it? Can you afford to buy the hardware? I mean, when I started programming, transistors were still $1.50 apiece, and now I can get 10 billion of them for $10. If you remember, digital cameras in the beginning had very poor resolution, and now the resolution of even a cell phone is so great that film is only for the people that really, really need it. So I think we're going to have continuing stuff coming out, problems that we're going to be able to solve, simply because the hardware gets bigger, faster, and less expensive, and the operating system and the code reach up to it. Right. I think one of the overall limitations in many systems is: what do you have to do, what do you have to know, and what do you have to have, to be able to do quickie experiments, as John is suggesting, to see whether something even makes sense or not? In the early days of Unix, because things were so small, and because basically you had to write your own code, but there wasn't a lot that you could do, it was possible to do quickie experiments, like that idea of, well, let me just see if I can have a voice-controlled shell; and I had that hardware available to me. I find that kind of thing today, in some ways, to be harder, because the systems that we're dealing with are so bloody enormous, and there are so many libraries and things, that it's hard to say: I just want to do a small experiment, what do I do? And I have to know too much. There are environments where this actually works plausibly well. For example, Python has an enormous collection of useful libraries, so if I have a half-baked idea, I might be able to find a Python library that already gives me a leg up on it. That would not be as true, for example, in the C world, where the library ecosystem isn't quite as integrated. So I'd love to have more chances to do that kind of
experimenting, where it was really easy to get off the ground and see whether an idea made sense or not. Right, yeah. Okay, next question, from Juan Ignacio. This is a pretty long one, so
stick with me. He asked: what do you think is the modern equivalent of Bell Labs? I mean, is there any place today where engineers, scientists, technicians, and all sorts of other people are able to mingle and are given great freedom to experiment and create awesome things? Yeah, that's a good question, it really is. Bell Labs at the time was, I think, unique in the sense that it was very large. When I was at Murray Hill, from the late 1960s through to the 90s, there were, just in that one building, roughly 3,000 scientists doing work, probably at least a third of them with PhDs in various fields, from physics and chemistry and mathematics, and computer science later, and a very strong support system. So that was part of it: there were an enormous number of well-trained and very good people around. The other thing that was unusual about Bell Labs, certainly from its inception in 1925, and certainly from the time I got there in the 1960s through the 1980s or so: it was an environment in which people were basically told to go do interesting things, but nobody said what the interesting things were. So you got a chance to work on things that you thought might be some combination of interesting and relevant, and the emphasis was on the interesting, not on the relevant part. There were people who spent their entire careers doing interesting academic work that got published in academic conferences and journals, and there were others who did very pragmatic things that had to do with the basic job of the company, which was improving telephone service, and then there were people like me who flipped back and forth between these things, and everybody kind of took advantage of that sort of environment. It worked because the funding at Bell Labs was very stable and uniform; basically, it amounted to a tax on telephone calls in the country, so that whenever anybody made a
telephone call, a little tiny slice of the money involved was peeled off and given to Bell Labs, with the requirement, basically, that over the long haul Bell Labs would develop things that would improve telephone service in the country. So that meant that funding was stable; it meant that they could take a very long-term view of whether something was worth working on, and management could take a long view in assessing it. And because it was big, you could afford to have a lot of noise in what individuals were doing, because the aggregate of it would work out better. There were other operations at the time that did good things as well. The obvious one is Xerox PARC, the Palo Alto Research Center, which produced an enormous number of really important things; a lot of the things we take for granted today, like window systems and laser printers and all that kind of stuff, came from Xerox PARC. They too had a corporate master, Xerox, but Xerox, the company that made printers and copiers, I think didn't really know how to take advantage of their research lab, and PARC was also a lot smaller. And then IBM had a very large and very effective research lab. All of those things, arguably, kind of went away over a period of time, in the 90s and the 2000s. So, to get to a roundabout answer to the question: there's certainly no equivalent to Bell Labs today, and there probably never will be, because there were too many things that were kind of unique to the time and the economic system and so on. There are certainly places that do really interesting research. It's hard for them to do the kind of unfettered, do-whatever-you-want-for-years stuff that Bell Labs had, because they're part of companies that are supposed to make a profit for their shareholders, but there are companies that are big enough and successful enough, that have enough money, that they can afford to do some of that as well. I mean, Google, I think, does
a lot of really interesting work, especially in artificial intelligence and machine learning, but in lots of other things too, and Microsoft does a lot of really good work as well. So they're not quite the same, but I think, arguably, over a period of time they have made and will make real contributions to the world. So, not the same, but in some ways similar. I saw the same type of research done at DEC. You know, we were the second largest computer company, and we had pure research labs, although they tended
to be more oriented towards computer and networking and graphics and things like that but they were pure research and then after that you had the advanced development engineers and then after that you had the manufacturing engineers and the sustainability engineers and each one had a different thing that they were doing but some of the other things i saw at bell labs i worked there at a branch labs for a number of years that were interesting was the library system where every week you would get this thing emailed to you as a list of of documents and white papers and things like that with a little abstract you would mark down the ones you were interested in you would send that back and the next week there would be all these white papers for you to look through and read and understand and stuff like that and it was just automatic there wasn’t a charge to your center or anything like that it was it was wonderful we also had a thing called one year on campus where if you went and you got a bachelor’s degree at some university and you were hired by bell laboratories they say okay work here for the summer and now go off and get your master’s degree at one of these prestigious universities and we’re going to pay your tuition and give you 1200 a a month stipend to live on and as a student back in those days 1200 i mean oh my that was oh amazing i mean that was a huge amount of money and there was no requirement that you come back to work at bell labs after you got your master’s degree and some people didn’t yeah right i always considered those people stupid because if you once came back to bell labs and worked there for even three years on your resume it was great and you know but some people didn’t you know right but you realize john you’re dating yourself and me by saying 1200 was a lot of money but well yeah but i mean everybody knows how old i am okay but but the the last thing i want to talk about was the licensing of ideas where bell labs would i don’t know if it was 
required by law or you know or required in the whole concept but the transistor was licensed out to people at an amazingly small amount of money and all the a lot of the other things that bell labs was freely licensed out it may have been a small charge but it’s ridiculously small and that helped to make bell labs you know useful to a whole series of different companies and things and we have in the united states these things called national labs sandia you know los alamos you know all these different national labs that unfortunately have not been getting the type of funding that they should be getting to make pure science that much more you know available to people and that’s sad yeah i think you’re right i think the licensing stuff is interesting in part i think it was required because att was a regulated public monopoly that meant that it it was strongly controlled by both federal and state governments and there were limits on how much money it could charge and what kind of services had to provide and so on and in particular it wasn’t allowed to make money off certain things that they might have created because that might be cross-subsidizing and therefore in effect artificially raising prices or preventing competitors or something like that so i think the transistor was an example of something that was licensed because of that concern certainly unix was licensed that way because at the time unix was developed in the let’s call it 1970s att was not allowed to sell software and therefore what they did was they had to basically give unix away to universities as you say for essentially a nuisance fee just the media and they sold it commercially in very small quantities for again just a nominal amount of money and i think it was because again they were regulated in that way i think one of the comparisons and and this goes back to the early question i think one of the comparisons with the national labs in the united states one of the advantages that at t had that is 
the parent company of bell labs was that there was no shortage of really important interesting problems to work on and so you could be curious about all kinds of things and there would be some real problem that actually was a place where your curiosity would match that real problem and i think it’s harder perhaps in a setting like a government laboratory as you like sandia well cindy was focused on weapons so that doesn’t count but some of the other uh national labs and also universities sometimes you have to make up your problems and that’s not as effective as having somebody who really really cares about the answer to something that you can talk to true um okay guys we have plenty of questions

Here, one of the most voted questions (and please, these are the most voted questions, so I am completely unbiased) asks: why do strings seem to be so complicated in C? Is there any story behind that? So, unfortunately, I don't really know, although I have a conjecture. Strings are not complicated in C; they're just different. There's more than one way you could represent strings, and the C representation, which is just a sequence of bytes with a terminating character at the end, has some real advantages: you can manage it, and you don't have an extra data structure associated with a string; a string is just a pointer. So that has advantages, and then of course it has the myriad disadvantages we have all discovered when we try to write code that uses strings: you run out of space, you run off the end of the string, the string wasn't terminated, how long is it (it's a linear algorithm to find out how long it is), all kinds of weirdnesses. But I conjecture that originally it was because that was the simplest thing you could do that worked really well. And remember, the first C compiler fit in something like 2K bytes of memory, so you don't have a lot of room to fool around with complicated data structures, and "complicated" here would be just having a length field associated with the string. Of course, if you have a length field, how big is the length field? You can't make it too big, because then you're using precious memory, and you don't have any memory. So I think it was probably a carefully considered engineering decision that for many things was great, and for other things is kind of a pain.

Okay, this is also one of the most voted questions: the audience wants to know what OS to use these days. Well, I am sitting in front of a MacBook Air, which is of course running one of Apple's systems; whatever it is, I don't know, it's not the most recent, Mojave probably, or something like that. And I use it almost exclusively as if it were a Unix machine. I don't use very many of the fancy things that come with it, and very often I use it exactly as a terminal to connect to a real Linux system run by the computer science department at Princeton, and that's fundamentally my environment. The nice thing about OS X, macOS, whatever they're calling it today, is that it looks just like Unix at the level I use it, and I can write code that works on both it and the Linux system. So that's my computing environment. I don't know about you, John. I only use Linux, or GNU/Linux, to make Richard Stallman happier. I refuse to tell people what distribution of GNU/Linux I use, because when people ask me what distribution I use, what they're really asking is what distribution they should use, and I never know enough about them all to tell them. So I tell them: hey, you can just download one and try it out; if you like it, keep it; if you don't, throw it away and download another one. But I do not use Microsoft, I do not use Apple; I use an Android phone. I've never used Microsoft Windows. Well, I've actually installed Microsoft Windows; I did that for my mother and father, because they were in a situation where I couldn't be their systems administrator from 900 miles away. They lived in a retirement home, and there were a lot of people around who knew Microsoft. However, over time those people mysteriously started using Linux, so eventually my parents were able to get their support from somebody local, and I could install Linux on their system.

All right, we have the next question, from Francisco. He's asking: what concepts do you wish you had known at the time of starting to develop Unix, even if they were not invented at that time? You guys ask embarrassing questions. But I have a great defense for that one too: I didn't develop Unix. I had nothing to do with it; I was a camp follower. I was surrounded, fortunately, by some astonishingly good people. I think some of the things that probably came along

not long after might have been worth thinking about early. One is networking. The original Unix machines were obviously self-contained; they didn't talk to anything except their directly attached peripheral devices. So networking, maybe, is something that, if it had been thought of and dealt with in some clean way a little earlier, would have worked out better; I simply don't know on that one. So I would say networking is one. The other thing that I think was probably an issue all along was memory management, basically virtual memory, because that was effectively a research topic when Unix was developed. Then, as the combination of hardware improvements and software understanding came along, it became much less a research problem than an engineering problem, and at this point it's presumably a totally solved engineering problem: how do you manage the memory you have to make it look like it's infinite? I'm chuckling a little, because at one of the first conferences on Linux, held at the USENIX Association, one of the Linux developers stood up and gave a very good talk about malloc, the different types of malloc and the efficiency of each, and at the end of the talk a friend of mine stood up at the back of the room and said: great talk; the thing is, we did all of these studies 20 years ago and came up with exactly the same results, except we actually measured the results. [Laughter] A lot of times we reinvent things over and over again, and sometimes it's worth looking back to see what people have done before you; that's why having the papers, and reading the papers, is good. I would say one of the things that would have been nice to have from the very beginning is more of the concept of a thread model inside the kernel. I mean, fork is okay, but to have a lighter-weight thing, the concept of a thread, from the very beginning might have been nice. Even with a single-core processor, threads are useful.

Good. We have another question; this community is all about things going wrong, so another theme question. Ezequiel is asking: what is the weirdest bug that you found in the initial version of Unix? Do you recall how it was fixed? Yeah, I fall back on my usual defense: I don't recall an unusual bug there. Let me give you an example of an unusual bug that is contemporaneous with Unix, but did not appear on Unix, although in principle it could have. This one happened to Steve Johnson, whom you may know as the creator of the yacc parser generator and, along with Dennis doing the port of Unix to the Interdata, as the creator of the Portable C Compiler. Steve was in the same group, but at that time we were not all on the Unix system; we were also using a very large machine made by General Electric, of all people, called the GE 635. It was a big machine, sort of a follow-on to the IBM 7090 series, or like a stripped-down version of the Multics GE 645: a big, clunky machine, 36-bit words and things like that. But it had two processors, and for the time (think of this as the very early 1970s) that was unusual, I think. Steve Johnson had a program, and when he ran it, half the time it would print the right answer, say 1.0 or something like that, and the other half of the time it would print something like 0.372, some completely wrong answer. You'd run it and randomly get either the right answer or a completely random wrong answer, and nobody could figure it out. It finally turned out to be a hardware problem: the floating point unit in one of the two processors was broken. So if your job happened to run on the broken processor, you got a random wrong answer, and if it ran on the other one, you got the right answer. That was a hardware bug: hard to find, but presumably, once found, easy to fix. Would there be similar things in Unix? Absolutely. Any computer is going to have that kind of thing, some combination of phenomena, hardware, software, operator error,

all kinds of things that give you basically random results and non-reproducible bugs. Those are probably the worst kind of all to find, because you can't reproduce them, you can't make them happen when you want to, and so it's very difficult to fix them.

I have a very interesting question here, from Nicolás. He's asking: when did you really realize that you had changed the world with your work? Thanks, Nicolás. I don't have a date on that, but I can remember a couple of random things that suggest it. We always thought, you know, Unix is this thing we use internally; we didn't think too much of it for the first few years. Then it became a system used by a variety of other people who were very much like us, people at other industrial operations or, largely, at universities. They were just like us, so obviously they would know all about it, but the outside world would know nothing about it. Then at some point I was looking through the New York Times, and this dates the story, because it was at a time when the New York Times still printed classified advertisements, and there was an advertisement from some organization that wanted a Unix programmer. I remember clipping it out and taking it to Ken and Dennis and saying: we've arrived. I don't know when that was; I would guess probably the early 1980s. So that's one. And the other thing: there have been a couple of examples on the popular TV show in the US called Jeopardy! where the clue had to do either with Unix or with the C programming language, and I got one of those from Dennis at one point. You still have it? Awesome. For me it was about 1981 or so. I was working at Bell Labs, and I went down to Murray Hill, New Jersey; we had a meeting of a whole series of Unix systems administrators from AT&T and Bell Labs, and we were sitting in a large room in a big circle. That was the number of people in all of Bell Labs and Western Electric who were Unix systems administrators. At the end of the meeting we were just sitting around talking, and somebody said: well, I bet Unix is going to be the last operating system ever written, and everybody said, yeah, yeah, we think so. But I said: no, I think there will be other operating systems after this, but I'll bet they'll be called Unix. And I was only off by a few letters; we still have the x there.

Okay, awesome. Many people are also voting for this question, I think directed at Brian: can you give some comments, or do you have some thoughts, about Plan 9? Yeah, I do not have informed thoughts about Plan 9, in a way. What Plan 9 was, was an attempt to take all the good ideas from Unix and do them even better, in some sense to take an idea that worked for something and try to generalize it further. One good example: in Unix, as we talked about a little while ago, devices are in the file system, so you can manipulate devices with the same system calls and the same functions as regular files on the file system. That's a nice idea; it generalizes from files on a disk to peripheral devices. And that was an example of something that could be generalized even further in the Plan 9 world, so all kinds of things became file systems, or hierarchies of directories and files. For example, the windows on your screen: each window would be a file, and there would be a hierarchy of them to capture all of the windows. Shell environment variables would be a file system, where the directory was the environment and each file within it a shell variable, and so on. There were all kinds of ways you could do that: processes, network connections, all these things would fall under a single unifying picture: they're all the same thing, and you can manipulate them with the same I/O calls. That was typical of Plan 9: let's

take a good idea and see if we can push it, extend it further, and I think that was a very positive kind of thing. The flip side is that I think it went too far in some ways; it was sort of procrustean, in the sense that things had to be exactly the right size. For example, the standard I/O library that we had all grown up with in the C world was deemed too irregular and not very clean, so a complete new version of it was written, with different syntax and different semantics. That meant you couldn't import or export programs in the Plan 9 world without redoing their I/O, which made it a lot harder for the good ideas from Plan 9 to find their way outside, and for the good ideas outside to find their way back into Plan 9. That "we will do it over again, and we'll do it better, but incompatibly" approach was a problem. Another example was make, a command we all use to do compilations for us. That wasn't good enough, so there was a new one called mk, with different syntax and different semantics. It certainly cleaned up a lot of the problems with the original make, but it meant you had to rewrite your stuff if you wanted to move a program from the Unix world into the Plan 9 world or vice versa. The combination of those things was kind of a problem, and, speaking only for myself, at some point I said it's too much of a problem for me, because people use my code on Unix and I don't want to be bothered rewriting it for the Plan 9 world. There were people who did heroic work trying to take things from one world to the other, building tools and bridges that would get you across, however tricky; they did versions of the standard I/O library, for example, and a variety of other tools. But for the most part it was too much: two separate worlds, and it was too hard to establish the new operating system. On the other hand, Plan 9 did some really good things, and perhaps the most useful thing it did, a legacy that will outlive it for a long time, is the creation of UTF-8, the external representation of Unicode. That was done by Ken Thompson and Dennis Ritchie; sorry, I misspoke: Ken Thompson and Rob Pike. They created that representation, and it was used throughout Plan 9. Plan 9 used Unicode throughout; internally it was probably 16-bit characters, but externally it was UTF-8, and that has become the way we pass information around the Internet today. So that's a Plan 9 legacy which is astonishingly important. I'll agree with Brian that certain things came from Plan 9, but when I first really looked at it, at a USENIX conference, Ken and Dennis were there, and I said: what compatibility do you have with all these other Unix systems? Because you have to look at developers, you have to look at companies. They had already gone from proprietary operating systems like VMS and MVS and OS/2 and the like, and they had gone to Unix with the promise that their programs would be portable, and to say to them, throw it all away because we have the right answer in Plan 9... I said, without the compatibility it's just not going to get off the ground.

Okay, another question, but we have about five minutes, so I think we can have a little bit of fun. Bruno is asking: do you guys like video games, or did you like some at the time? Can you name a few of your liking? [Laughter] Oh, Bruno, what an interesting question. I'm reminded of something said by Saint Paul in one of his letters: when I became a man, I put away childish things. But passing over that: the only video game that I think I ever got really, really into was a version of Spacewar, which happened to run on probably the same kind of early PDP-7 that Unix was done on. It was one of those things where you and a friend would orbit the sun and try to shoot each other while not falling into the sun, and that was great fun; I have no idea how many hours I wasted on it. But I think that's the last time I ever really got into video games of any sort. For whatever reason they didn't really appeal to me, so I wasn't part of it, and I realize that's another example of a generation gap or something like that. But I'll let John speak for himself.

I played Adventure on an ASR-33 Teletype, where you would go down into these twisty-turny little caves, and when you got down sufficiently far you would say "xyzzy" and pop back out. I played that because I was also interested in state-driven games, and that was a very small game that fit in a very small space, just going from state to state; it was simple to understand. I also liked Spacewar; I played it on a LINC-8 computer with a little round oscilloscope screen, but I only played it about three or four times, and I'm kind of with you, Brian: I have more interesting things to do. It's okay if other people play games; I just have too much stuff to do to use up all that time. Yeah, I still remember xyzzy as a password for something, although I didn't play much of Adventure. And "a maze of twisty little passages, all alike" is a useful phrase for describing modern software.

But I'm going to go back a little bit to a bug, and something I saw Ken Thompson do; I think it's something people have heard about. I was at a remote Bell Labs up in Andover, Massachusetts, and we were trying to do numerical control on a numerical control machine. We were doing this with a Unix system feeding the machine, and we were having problems getting the programs to download. Ken happened to visit us one day; we said, hey, we have this problem with Unix, and told him what it was. He said, oh yeah, I know what that is, went over, sat at the console, broke into the machine, got root access, and fixed the problem about as fast as I'm talking here. That was the famous trapdoor he had built into Unix systems, where the compiler generated the trapdoor; it was not visible in the source code. That's why I tell everybody security is really hard. That's interesting, because that's the story behind his Turing Award lecture, Reflections on Trusting Trust, or something of that sort. And John's right: you don't trust anyone. And that paper he wrote, the Turing Award talk, is absolutely worth reading even today, no question whatsoever; perhaps especially today.

Okay, we have about three minutes left, so maybe this will be the last question. This one is from Javier; I think it's directed more to you, Brian. He asks: can you share some anecdote from writing The Practice of Programming with Rob Pike? Yeah, I'm not sure there's anything specific. One thing that's interesting about it, something which at the time was comparatively unusual and today would probably be taken for granted, is that a fair amount of that book was written while Rob Pike was actually in Australia; he was teaching at either the University of Sydney or the University of New South Wales, I don't remember which at this point. So he was in Australia and I was still at Bell Labs in New Jersey, and instead of being able to walk two doors down the hall into his office, I had to send mail and we had to flip things back and forth, and that undoubtedly slowed some things down. The other thing I remember specifically, and this is one of those things that speaks to the caliber of programmer that hung out in that area: Rob and I were writing The Practice of Programming, and one of the chapters was going to be about notation, the importance of the notation you have available to talk about what you want to do with your computer. The obvious piece of notation that shows up everywhere in Unix is regular expressions, because they're pervasive: shell wildcards, regular expressions in grep, all those kinds of things. So an obvious thing we wanted in the book was an example of regular expression processing. How do you get a regular expression processor? Well, you could do it the way the original grep was created, which is to go to something that has the regular expression processing in it, throw away the stuff you don't want, and use the part you do. But that was too big: the RE part of grep was probably 500 lines of code at that point, far too big for a book. So Rob and I talked about it for a while, and then he went off to his office two doors down the hall and came back an hour later with a regular expression parser.

That was the one that shows up in the book. It's about 30 lines of code, in C, an absolutely elegant recursive regular expression recognizer that handles a handful of special characters and metacharacters, star in particular, so it's like a miniature version of grep: 35 lines of really, really beautiful, elegant code. And that was Rob, just going away and thinking about it for a bit and then writing the code. I don't play in that league; I'm sorry, these folks are just entirely too good for me, but it was nice to have them around.

Okay, gentlemen, I think unfortunately this is all the time we have. We would like to spend the entire day talking with you, but I think this is it. So, John, Brian, thank you so much; we are honored to have you at our event here in Buenos Aires. It's been absolutely wonderful. Ladies and gentlemen, thank you. Thank you. Okay, great. Hopefully someday this will all be over; I can personally vouch that these are really wonderful people, very warm, and if you ever get a chance to go down to Buenos Aires and hang out with the sysarmy people, please do. I am looking forward to it. Thank you for the endorsement, John. [Laughter] Have a great day.