Thursday, July 4, 2013
Yes, the inventor of the mouse is dead
But Doug Engelbart was such an inventive person that attributing only the mouse to him is a gross understatement of his importance to modern computing and to how we work with computers.
When he invented the mouse, computers were hardly interactive at all. Timesharing providers, the ISPs of the day, had begun to appear, but these were mostly for academic use and for computer geeks, of which there were few. What Engelbart envisioned when the mouse was invented was something completely different from computing as it was seen back then: computing and computers not as a massive calculator or Rolodex, but as a tool to augment the thinking human's brain.
And this wasn't just the mouse; the thinking went far beyond that. Doug Engelbart was the mastermind behind the Mother of All Demos, which you can see on YouTube. Note that this was in 1968, when truly interactive computers were still rocket science.
The mouse, windows, you name it: Doug thought about it. Maybe his visions weren't exactly how we use computers today, but they were still pretty close, and the concepts sure were spot on.
Read more on Doug Engelbart in The New York Times, for example. And before I close up this post, think about how difficult it was to invent the mouse. Not technically difficult; and you can see what it looked like at the Computer History Museum in Mountain View. No, the difficulty lies in thinking up what it would be used for. This was in an environment where computers ran in batch, where a computer user was either a programmer or someone who got data from a programmer, and where there were no GUIs, no interactive text editors, no spreadsheets, no internet and, you youngsters listen carefully now, no Facebook. Scary, huh? It was a time when a minicomputer such as the PDP-8, which was just about to be introduced, was the size of a major appliance and had less processing power than a similar major appliance of today. This was when Doug figured out that we should be interactive with computers, and that alone was revolutionary. But not only that, he also set out to implement that interactive system, and developed and invented the tools he needed for it, such as the mouse, in the process.
Without Vannevar Bush and Doug Engelbart, things would be very different, and they could see this happening way back in the 1950's, and in the case of Bush, even earlier. But whereas Vannevar Bush was a visionary, Engelbart was that, but also an inventor and implementer, ready to create the future he envisioned. An impressive guy for sure.
RIP Doug, thanx for everything and in particular for being such an inspiration.
/Karlsson
Thursday, October 13, 2011
Dennis Ritchie, the creator of C, dies at 70.
When I first got in touch with C, it was in the early 1980's. I was a sysadmin at a Swedish telco operator (then THE Swedish telco operator, Televerket, nowadays called Telia) for a system used for software development for a PABX called A345 in Sweden, better known as Meridian in the rest of the world (co-developed by Televerket and Nortel). The Meridian system was the biggest of the non-custom-built PABXes in those days. The language used to program it was called SL-1 (Switching Language 1), and the development tools, like editors (vi / ined), compilers etc., ran on a Unix system.
This sure was one of the earliest commercial uses for Unix. The Unix variant was Version 6 and was not a BSD or anything like that; this was way before BSD really. Rather, the system was built by Interactive Systems, which was the first commercial Unix vendor. This was Interactive 2.5, based on Unix Version 6, mind you. sh and csh only, no bash. Lots of hardcoded stuff in the shell etc. The hardware it was running on was a PDP-11/70.
I knew a bunch of programming languages already, like Pascal, Basic, APL (yes! APL), Fortran etc. Also, I had done my fair share of microcomputer assembly (mostly on 6502 and some 6800 CPUs). As the system we were sysadmining was so Unix focused, everyone who got in touch with it had to know some C, so eventually I was sent on a C class (not a C++ class, that is ;-). This changed my life, to a large extent.
Having access to a reasonably high-level language like C, that was still so close to assembly and hardware (did I tell you I was a hardware geek also?), was just what I wanted. Close to the hardware, but high-level enough that you could write reasonably sized systems with it without spending too much time on individual bits and bytes, except when you wanted or needed to.
Of the things I have learnt over the years in the IT industry, like Windows, Linux, SQL, databases, optimization, multi-threading, you name it, none of them has been even close to as useful as those C classes I took nearly 30 years ago.
Dennis Ritchie, the creator of C, really got C completely right. And the thing is, C was very close to being perfect even at the first try. Look at "The C Programming Language" book; it has to be one of the most concise books on programming I know of. And I still use this book. It's factual, at times funny and always complete. And all this in a very small package, really amazing compared to the 650-page books on C# or something of today. They say C is so complex, so why does it take just over 100 pages to explain it, and 600-something to get started with a language that is supposedly "easier to learn"?
Want to know how to write solid code from the ground up? Learn C, even if you will eventually develop in some other language. Want to learn how computers work? Learn C. Want to understand how Windows works? Learn C. That is how I learnt Windows some 20 years ago, using Windows 3.1 and 3.11: I was writing Windows apps in C (and this is how I still, to this day, develop Windows apps), and suddenly I knew how Windows worked, without even trying.
Too bad that Dennis Ritchie is dead, but the legacy of the world's greatest programming language, C, lives on. C has been used to develop applications on all kinds of levels, from end-user applications to databases and infrastructure to operating systems. Is C++ better than C? Probably so, but some features of C are gone in C++, and some things that C++ adds aren't so great. What about C# then? Same there. What about Java? Nope. It's a good language and a good environment, but I still prefer C, which is not to say that Java doesn't have its merits. And then we come to things like Tcl/Tk, Perl, PHP, Ruby etc.? Useful tools alright, but not really proper programming languages in my mind, and not providing the insights you get by understanding C.
Over the years, when I have been with a customer or used some program and something broke, knowing C meant I could diagnose much faster what was wrong and how to fix it. Pascal / Fortran / APL and all those languages never did that for me. C required an understanding of the environment that the code was to execute in, but once you had that understanding, things got A LOT easier, and you suddenly knew things that you didn't before, things not directly related to the task at hand, but you knew them.
Rest in peace Dennis, and thank you for bringing C and Unix to the world. I know that my life would have been very different without C and Unix.
/Karlsson
Born to code in C. Using vi (as emacs didn't fit in the memory of that PDP-11/70).
Monday, November 15, 2010
Remember ComputerLand?
It's soon time to celebrate 30 years of IBM PCs (they were first released in 1981, initially in the US and Canada). Before that, the microcomputer market was much more diversified, but once IBM released the PC, most of what remained after a few years was the PC itself and Apple.
If you remember how it all started though, with the MITS Altair kits, you might also recall the biggest competitor to MITS, IMSAI. IMSAI, big as they were, died off by the early 1980's.
So then, what about the title of this blog? What the heck is ComputerLand? An amusement park? Nope, far from it. Actually, if you were in the US in 1981 and decided you wanted to buy a PC, you had few options of where to get one: either one of the few Sears stores that had them, but those were very few, or ComputerLand. That was about it, for a while. And ComputerLand was big, real big! And they kept growing even bigger; having something close to a monopoly on selling the PC was like a license to print your own money. And now they are all gone.
Well, there is a franchise chain called ComputerLand around to this day, and they have been around a while, but they are not related to the original ComputerLand. And what about IMSAI? Well, Bill Millard, who founded IMSAI, also founded ComputerLand. And as he was the person who screwed up IMSAI, he was also the person to screw up ComputerLand.
ComputerLand was an international franchise chain of computer stores, initially stocking just about every personal computer on the market. They were the first and went on to become the biggest, but the whole operation was eventually killed off by internal battles (among the founders and others, like Bruce Van Natta and Jack Killian, who co-founded IMSAI, a company that in practice was drained of resources to found and build ComputerLand, and what they got from Millard for that was: nuthin').
The lesson is: don't think you can get away with just about anything just because you did it once; history will eventually catch up with you. And although one may discuss whether Millard was wrong, or whether his intentions were wrong (I'm not so sure they were), that he was weird and lost track of what was important at IMSAI and ComputerLand is pretty clear, in my mind.
By the way, do we still have hunger in the world? Or is world hunger also history? One of the projects Bill Millard was part of set out to end world hunger by the year 2000. How? By raising consciousness! Oh, of course, why didn't I think of that. Nope, didn't work. But the intentions were good at least.
No, let's forget the old ComputerLand (they were sold and renamed a few times, and finally went to sleep some 10 years ago or so) and remember the PC: the original clunky, 5 1/4 inch diskette thingy with 16K RAM (yes, that was it in the original PC: 16K). At least the PC survived and is with us today, although it didn't end world hunger. And we have something to run Linux and MySQL on! Thank you, IBM!
/Karlsson
Thursday, April 15, 2010
A few notes from the History of Database Systems BoF
I will not write much on what went on at this BoF, but a few words are appropriate I guess.
- We were some 10 to 15 people in the room at the end.
- I started the thing by talking about the ancient history of databases, and went on to talk a bit about the reasons relational databases appeared on the scene.
- We talked more on these reasons. The old network and hierarchical databases were largely tape-oriented, even when data was on disk.
- My theory that search was a very major factor in the adoption of relational database technology from the 1970s on seemed to be accepted.
- When I went on to discuss search in an RDBMS being contextual, the need for non-contextual search caused quite a few debates.
- That non-contextual search will be a factor in moving to NoSQL, which is a theory of mine, was not accepted by anyone else but myself :-)
- What contextual and non-contextual search mean, and whether this even is search, was discussed in gruesome detail.
- This brought on a NoSQL debate that lasted till the end of the BoF.
- That NoSQL is about performance was largely accepted.
- That key-value storage is a key behind performance was not (and I sure don't see it that way).
- The value of a key-value store in itself was discussed in detail. Are we just storing any kind of unstructured data as a value, or is it XML (which is mostly the case currently) or an instance of an object?
- We also debated storing a set, as in an RDBMS, vs. an instance, as a key-value store may be seen. Whether an RDBMS really is set-oriented and whether a K-V store stores an instance was a hot topic (see the sketch after this list).
- I think we ended up with the notion that we will probably see a mix of RDBMSes and K-V stores in the future, and that they are complementary in the short to mid-term.
- In the long term, I claimed that a K-V store will not persist as a generic solution, as it actually has less functionality than an RDBMS, whereas others claimed that a K-V store applied properly, with instances and instance pointers within values, is the way to go.
- Whatever happens, it will be interesting.
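To illustrate the set vs. instance debate, here is a minimal sketch in SQL; the customer table and its columns are hypothetical, and the K-V side is shown only in the comments:

  -- In an RDBMS, even a lookup by key is a set operation: the result is
  -- a set that happens to contain 0 or 1 rows.
  SELECT customer_id, name
  FROM customer
  WHERE customer_id = 42;

  -- And the same mechanism gives you any subset of the table:
  SELECT customer_id, name
  FROM customer
  WHERE city = 'Stockholm';

  -- A key-value store instead hands back one opaque value, an instance,
  -- for one key (think: value = get("customer:42")). Getting "all
  -- customers in Stockholm" is not a primitive operation there.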
/Karlsson
Tuesday, April 13, 2010
BoF only special - See an incredibly ugly Oracle T-Shirt!
Yes, no kidding, I'll be wearing an old Oracle T-shirt from my days at Big-O in the 1980's. I was, and you Oracle dudes who have been around for a while might remember these, an Oracle Unix Wizard. Actually I was a Wizard II (I went to the second training), but the T-shirt I will be wearing is from Oracle Unix Wizards I.
Where will this take place, you ask, as you just HAVE to come? Well, no further than my BoF tonite on the History of Databases. And I can tell you, this is not a T-shirt that I would normally wear in public, but there is a lot of stuff I would do to attract a crowd to a BoF (just to see everyone running away in disgust). So at 7:00 PM tonite, Tuesday, in Ballroom C (unless the location changes). Bring your good mood, ideas on the past and on the future, and above all, your barf-bags, to see what an "Oracle Unix Wizard" looked like in the 1980's, although with a bit less gut.
/Karlsson
Searching in databases - Maybe what will drive the next wave
In the old days, before SQL and Relational and all that (not when the Vikings toured the world, drinking, being violent and causing mayhem, but still the old days), the databases in use, the first reasonably generic database systems, were hierarchical or network based. These had a strict schema, and data was extracted by navigation (i.e. find company X, find orders for company X, find items etc.); there was no way of searching data (which wasn't much of a problem, as data was largely stored on tape anyway, which isn't really searchable in the now common sense).
When SQL came around, the relational style schema allowed a much freer way of navigating data, and it also allowed searching. The SQL search as we know it is still contextual (i.e. you have to specify what to search; a SELECT from a customer table based on address will not retrieve employees with a matching address). All the same, when SQL came around, the ability to search and the relatively free structure of data relationships took database use to a new level.
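To make the contextual part concrete, here is a minimal sketch (the customer and employee tables are hypothetical, just for illustration):

  SELECT name, address
  FROM customer                        -- the context: one named table
  WHERE address LIKE '%Main Street%';  -- searching one named column

  -- An employee with a matching address is never found by this query;
  -- to search an employee table you have to write a second query naming
  -- that table and its columns. A Google-style search has no such context.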
But searching today is often compared to Google, and that kind of search is really non-contextual. This is an area where the NoSQL movement has an edge on SQL, mostly because of the largely schema-free nature of NoSQL implementations. If search was a main driving force towards SQL, will the same happen with NoSQL? Maybe, I'm not sure. What I AM sure of, though, is that we need to develop SQL and the relational model to support more schema-free operations, mainly search, but I think there are other areas where this is relevant. And will this be the final nail in the coffin of true SQL systems? I'm sure it's not; we can enhance the functionality of the SQL-based RDBMS without wreaking havoc with relational algebra, somehow. But any SQL-based RDBMS that will stay around needs to have some support for data that is non-structured.
So why would a SQL-based RDBMS with support for unstructured data and searching be better than a plain NoSQL implementation? In my mind, this will be the case because NOT ALL DATA is unstructured. Customer information, credit card payment data, product catalogs and such are distinctly structured, and a SQL-based RDBMS enhanced to support non-structured data will potentially allow you to work with any kind of data, structured or non-structured.
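As a sketch of what a first step could look like, MySQL can already mix a structured condition and a free-text search in one query using its full-text indexing; the product table and its columns below are hypothetical:

  CREATE TABLE product (
    product_id  INT NOT NULL PRIMARY KEY,  -- structured: exact lookups
    price       DECIMAL(10,2) NOT NULL,    -- structured: range conditions
    description TEXT,                      -- non-structured: free text
    FULLTEXT KEY (description)             -- full-text index on the text
  ) ENGINE=MyISAM;                         -- FULLTEXT needs MyISAM as of today

  SELECT product_id, price
  FROM product
  WHERE price < 100.00                     -- structured condition
    AND MATCH(description) AGAINST('wireless keyboard');  -- free-text search

This is of course only a taste of it; a proper solution would have to go a lot further towards schema-free data than a single TEXT column.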
So, is having one piece of software handle different types of data really a good idea? In my mind it is, as the deal here is that even if the data is a mix of structured and unstructured, the different sets of data are still related, and it is relevant to combine operations on both of them, as one set of data.
/Karlsson
Thursday, April 8, 2010
While at the MySQL UC, pop by the Computer History Museum
If you are coming to the MySQL User Conference, you might want to pop by the Computer History Museum. The CHM is in Mountain View, just off the 101. If you have a car, just take the 101 and get off at Shoreline; it's just on the east side of the 101. If you don't have a car, you can get there anyway: from the UC, take the light rail to Mountain View, and then you can walk (some 20 minutes or so, not the nicest of walks, across the 101, but it's possible, I've done it) or take a bus from Mountain View.
At the CHM, among other cool things, is Babbage's Difference Engine in working order, a mechanical computer. That Babbage was a smart dude is obvious from the fact that although he designed the machine, he never finished building it, and when it was finally built using his original designs, it actually worked! I mean, the whole concept of designing the thing first is truly weird, along the lines of code documentation that is actually correct and commented code that is actually helpful, two arcane ideas that I find very hard to grasp. The machine is demonstrated at the museum, and I think there is work in place to make it run Linux (yes, that is a joke, it's just barely powerful enough to run DOS).
The PDP-1 restoration project at the CHM is also interesting, as are many other things there, so a visit to the CHM is recommended.
/Karlsson