There is nothing new under the sun…turn, turn, turn

Archive for the month “November, 2012”

OK, what about MORE MORE MORE’s law?

My previous post, Simultaneous Ingress, Runkeeper, last.fm, Spotify – let’s just call it MORES law, has resulted in some Google+ chatter. Specifically, the chatter focuses upon this statement that I made:

Things have changed significantly in the computer world between 1982 and 2012, and they’ll change a lot more between 2012 and 2042.

Upon seeing this statement, Tad Donaghe said:

They’re going to change a lot more in the next 10 years than in the last 50…

Michael Cohen, who reshared my original post on Google+, agreed with Donaghe’s assessment.

And they’re right. Well, they’re right within the constraints previously specified in this blog – despite the changes going on around us, we’ll still have hopes and fears just like the people from thousands of years ago.

You can see that Cohen and Donaghe are right by looking at the past. There was technological change between 1970 and 1980, and there was change between 1980 and 1990, but a valid argument can be made that the change in the 1980s exceeded that in the 1970s.

But why, when the underlying hardware improvements were roughly linear, did the overall pace of technological change look exponential? This question was examined by Linas Vepstas.

Every year, there is a 5% or 10% or 20% increase in the size of a silicon die that a computer CPU is made out of. The change is not exponential, it is linear, more or less.

Yup, that’s what we said before. But Vepstas goes on:

There are also small, incremental improvements in all aspects of die-making technology.

After listing examples of these incremental die-making technology improvements, Vepstas goes on to talk about other improvements, such as improvements in the machinery used to make the chip, improvements in machine components, and the like.

And Vepstas doesn’t even talk about the things that use the chip – the hardware, the software, and the infrastructure such as the networks. The improvements in the chip allow all sorts of improvements in other areas. You’re not going to be able to host a graphical multi-font word processing application on a windowed operating system if you only have an Intel 4004 chip.

Ray Kurzweil also discusses this:

Most long range forecasts of technical feasibility in future time periods dramatically underestimate the power of future technology because they are based on what I call the “intuitive linear” view of technological progress rather than the “historical exponential view.” To express this another way, it is not the case that we will experience a hundred years of progress in the twenty-first century; rather we will witness on the order of twenty thousand years of progress (at today’s rate of progress, that is).

Later in the essay, Kurzweil goes on to state:

Indeed, we find not just simple exponential growth, but “double” exponential growth, meaning that the rate of exponential growth is itself growing exponentially.

Kurzweil then uses the term “paradigm shift,” but unlike some people, he uses the term correctly.

The first technological steps – sharp edges, fire, the wheel – took tens of thousands of years. For people living in this era, there was little noticeable technological change in even a thousand years. By 1000 A.D., progress was much faster and a paradigm shift required only a century or two. In the nineteenth century, we saw more technological change than in the nine centuries preceding it. Then in the first twenty years of the twentieth century, we saw more advancement than in all of the nineteenth century. Now, paradigm shifts occur in only a few years’ time.

Or, when you work through the math,

So the twenty-first century will see almost a thousand times greater technological change than its predecessor.
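Kurzweil’s “almost a thousand times” figure falls out of a toy model in a few lines of Python. This is a deliberate oversimplification of his argument, assuming only that the rate of progress doubles every decade:

```python
# Toy model: the rate of progress doubles each decade.
# Progress during decade k (in arbitrary units, with the
# 1900s decade = 1 unit) is 2**k.

century_20 = sum(2**k for k in range(0, 10))    # decades 1900s..1990s
century_21 = sum(2**k for k in range(10, 20))   # decades 2000s..2090s

print(century_21 / century_20)  # → 1024.0, i.e. "almost a thousand times"
```

The ratio comes out to exactly 2^10 = 1024 no matter what units you pick, which is why the quoted figure is “almost a thousand.”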

Then Kurzweil starts talking about the Singularity…but I think I’ll stop there.

Simultaneous Ingress, Runkeeper, last.fm, Spotify – let’s just call it MORES law

I took a walk Tuesday afternoon, and after the walk I got to thinking.

My walk had two purposes – to go by an augmented reality portal that is part of Google’s Ingress game, and to get some exercise. To play Ingress, I had my Android smartphone with me. Although I didn’t need my Android smartphone to walk, it is a handy thing to have because I can then record my distance via the Runkeeper application.

Unfortunately, Ingress doesn’t seem to play well with other apps running simultaneously – at least on my older Android phone – so I didn’t even try to run both Ingress and Runkeeper at the same time.

After I returned from my walk (not recorded in Runkeeper), I began thinking about an ideal smartphone setup. In the ideal world, I’d be able to run Ingress and Runkeeper at the same time. Oh, and while I was at it, wouldn’t it be nice to listen to some music at the same time, perhaps via Spotify? And, of course, I’d want to log all my Spotify activity to last.fm.

How ridiculous would my ideal smartphone computing environment have appeared a mere 30 years ago? Back then, the idea of a smartphone didn’t even exist. The vast majority of personal computer users were performing their work on a text-based screen of 80 characters by 24 lines. And they were doing this on a device that was much larger than my smartphone, with far fewer features.

Things have changed significantly in the computer world between 1982 and 2012, and they’ll change a lot more between 2012 and 2042.

One of the reasons for these changes is because computers have become more powerful. Intel illustrates this with a processor-based example. Imagine that in 1970, you walked into a concert hall with a seating capacity of 2,300 people. Forty-one years later, imagine that you could take the entire population of China – 1.3 billion people – and fit them into that same concert hall.

Intel, of course, is heavily influenced by the thinking of Gordon Moore, who postulated that the number of transistors on a chip will double approximately every two years. This concept (Moore’s Law) has driven the expansion of Intel, its competitors, and those who rely upon the chipmakers to power their computers and computing devices.
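Intel’s concert-hall numbers line up with Moore’s Law arithmetic: the starting figure of 2,300 happens to match the transistor count of Intel’s first microprocessor, the 4004. A quick sketch (the 2,300 and 1.3 billion figures come from the analogy above; everything else is simple arithmetic):

```python
import math

start = 2_300           # "seats" in 1970 – the 4004 had about this many transistors
target = 1_300_000_000  # "the population of China" 41 years later
years = 41

# Strict Moore's Law: double every 2 years.
predicted = start * 2 ** (years / 2)
print(f"{predicted:.2e}")   # ≈ 3.4e+09 – same order of magnitude as 1.3 billion

# The doubling period that would hit 1.3 billion exactly.
doubling = years / math.log2(target / start)
print(f"{doubling:.2f}")    # ≈ 2.15 years per doubling
```

Strict two-year doubling slightly overshoots the analogy’s endpoint, which corresponds to doubling roughly every 2.15 years – close enough that the concert-hall story is essentially Moore’s Law restated.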

As a result, the dedicated BASIC computer that I used in junior high school in the early 1970s was eventually able to become the Macintosh Plus, which featured a windowing environment, a program called MacWrite that could beat the pants off of any typewriter by producing text in four different fonts, and the ability to store up to TWENTY MEGABYTES of “files” with my data. And, of course, things have continued to grow – and shrink.

But it’s never enough for us. If someone from the past were to look at one of today’s smartphones, with its ability to communicate anywhere in the world and its ability to tell you exactly where you are, the person from the past would marvel.

But in my case, I just want to allow four complex Internet-enabled applications to run simultaneously in a device that fits in the palm of my hand.

Meanwhile, the person who will be living thirty years from now will be unable to believe that I suffered under such extreme restrictions.

Why call it Moore’s Law? Let’s just call it MORES Law…because we will always want more, more, more. (How do you like it?)

International Oracle Users Week (IOUW), 1982? – 2005

Oracle Magazine recently ran a story by Rich Schwerin that included the following:

Imagine this: When Oracle was still called Relational Software Inc. (RSI), about 50 attendees from a dozen or so companies gathered in a small, windowless room at the San Francisco Grand Hyatt for the first International Oracle Users Week (actually just three days, August 23–25).

From that beginning in 1982, IOUW grew and grew. By 1986, attendance had increased by an order of magnitude, to 500. In 1996, IOUW was merged into Oracle’s own sponsored event, Oracle OpenWorld, which now draws more than 50,000 attendees.

So how long until there are 500,000 attendees at an Oracle event?
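A back-of-the-envelope answer, using only the attendance figures above (about 50 in 1982, more than 50,000 today, treating “today” as roughly 30 years later): a thousandfold increase over 30 years works out to about 26 percent growth per year, and at that rate the next tenfold jump to 500,000 takes almost exactly ten more years. A naive extrapolation, sketched:

```python
import math

# Figures from the article: ~50 attendees in 1982, >50,000 "today" (~2012).
start, now, span_years = 50, 50_000, 30

annual_growth = (now / start) ** (1 / span_years)
years_to_500k = math.log(500_000 / now) / math.log(annual_growth)

print(f"{annual_growth:.3f}")   # 1.259 – about 26% growth per year
print(f"{years_to_500k:.1f}")   # 10.0 – one more order of magnitude per decade
```

The clean answer is no accident: a thousandfold rise in 30 years is one order of magnitude per decade, so the next order of magnitude takes one more decade.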

It’s interesting to note that there are differing dates for the first IOUW. David Kreines explains the confusion:

The first user conference probably took place in 1985 in San Diego. I say probably, because some Oracle users were getting together informally to discuss the product (version 4 back then), but San Diego was the first time Oracle actually sanctioned a meeting. There were probably 50 users present.

As the years went past, it was easy to identify the differing priorities of the users, who did much of the planning, and Oracle, whose product was being talked about. Kreines:

As attendance grew, so did the quality of the presentations. A group of IOUG volunteers reviewed session submissions and, using objective criteria, carefully prepared a program that would have value to every attendee. At the same time, the Planning Committee came under increasing pressure from Oracle Corporation to add more and more “marketing” sessions to drive its ever-increasing growth.

This is not unique to Oracle, and can be found in any user group that is directly tied to the products of a single vendor. Eventually, Oracle solved the problem by taking over the conference itself. A press release was issued:

CHICAGO–(BUSINESS WIRE)–March 28, 1996–The International Oracle Users Group – Americas (IOUG-A) today announced that as part of an agreement reached with Oracle Corp., the technical education tracks planned for IOUW ’96 will be integrated into Oracle Open World, to be held Nov. 3 – 8, 1996 in San Francisco.

IOUW ’96 had previously been announced as a stand-alone event, scheduled for Sept. 15 – 20, 1996 in Dallas. However, with the integration of the technical education component of IOUW into Oracle OpenWorld, both the IOUG-A and Oracle Corp. agreed that the user community would be best served if a freestanding user education conference were postponed.

Just to illustrate how far we’ve come since 1996, the press release concludes with the following note:

EDITORS NOTE: There is a double slash after “http:” in the text above; there is an “at” symbol following “73144.1777” in the text above, and an “at” symbol following “rbrauen,” “76711.1015” and “jbaxter” in the contact information below. These symbols may not appear properly in some systems.

Yes, discerning minds can see that we in 1996 were still not comfortable with all of these forward slash things in web addresses, and we can also see that people were still using their Compuserve addresses as contact information. (I was 72604,2235 back in the day.)

IOUW itself survived until 2005, but Oracle OpenWorld, funded by Oracle, eclipsed it in popularity.

The blue bus is calling us (but not as frequently)

You knew this had to happen.

Once I wrote a post that included the subtitle “the West is the best; get here and we’ll do the rest,” you knew that I would have to write a post that included the title “the blue bus is calling us.”

So, why does a person take his life in his hands when getting near a Greyhound bus? And why is Greyhound’s service so shoddy?

Technology advances.

Highbeam Business:

According to the Bureau of Transportation Statistics, the number of companies providing regular route intercity bus service had dropped to 50 in 2005, compared to 143 intercity bus companies in existence in 1960. Greyhound was the only carrier to maintain a national network in 2005, although it made significant cuts in its service to the Northwest, Southwest, and south central United States.

The decrease in intercity bus service was largely due to competition from private automobiles and the airlines, which offered a variety of discount fares and added convenience that lured would-be bus riders to take to the skies instead. In the early 2000s intercity buses accounted for less than 2 percent of all long-distance travel in the United States.

Or, as Lily Tomlin’s character Ernestine said regarding a former near-monopoly:

We don’t care. We don’t have to.

A driverless car ecosystem? (Or, the West is the best; get here and we’ll do the rest)

Remember Tad Donaghe, and his thoughts on driverless cars? (I shared some of his thoughts in a previous post.) Well, Tad’s been doing some more thinking, and he has shared some more thoughts in Internet Evolution. His post, “The Age of Affordable Luxury Travel,” talks about – well, I’ll share his post in a minute, but first let me describe a recent personal experience.

I loaned my car to a family member for the weekend recently. Since that person lives in Irvine, this involved driving down to Irvine, having the person drive me to a Metrolink train station, and taking the train back to the Inland Empire. Despite the wide expanses in southern California, and despite the fact that this was a suburb-to-suburb trip (rather than suburb-to-downtown or vice versa), I was able to make this work – with some difficulty. I left work several hours early that day because if I had left work at the usual time and driven down to Irvine, there would not be any available train connections to get me home.

What does this have to do with driverless cars? I’ll get to that.

Back to “The Age of Affordable Luxury Travel.” Donaghe does not live in an outer suburb of southern California. He lives in the Phoenix, Arizona metropolitan area. So if Tad wants to go to Los Angeles, what does he have to do?

He could take a plane, which would involve getting to the airport a couple of hours early, going through security, sitting in a cramped plane for an hour, arriving at the beauty that is LAX (DISCLOSURE: I live near Ontario Airport; we are not happy with LAX at the moment), and then navigating his way to wherever he wants to go. And, as I’ve already noted, it’s not easy to use mass transit to get around southern California.

So let’s say Tad doesn’t want to take the plane. He could drive his car from Arizona to southern California. For those who are not familiar with the southwestern United States, that is a pretty desolate drive. I’ve never taken that particular route myself, but I have taken the driving route to Las Vegas, in which an otherwise nondescript place such as Baker, California becomes a relative paradise. I assume Blythe, California is similar.

At this point my European friends are asking, why doesn’t Tad just take a train from Phoenix to Los Angeles? Well, Europeans (and even some Americans) may not be aware that the western United States includes a whole lot of empty space. Unlike the Boston – New York – Philadelphia – Baltimore – Washington corridor, trains are not economically viable here.

Enter Tad’s driverless car. Yes, a driverless car can be expensive, but it’s a lot less expensive than laying some high speed rail track from Phoenix to Los Angeles. And, with a driverless car, you can go anywhere. If Tad wants to go to Ontario he can do that. If Tad wants to go to Rancho Santa Margarita he can do that. If Tad wants to go to Watts he can do that.

You’ll note that Tad’s title references affordable luxury travel. In Tad’s scenario, the driverless car is not stand-alone.

You inform your smartphone that you need a restroom, and in less than 15 minutes, an indicator lights up telling you that relief is near. Another self-driving robo-car has been dispatched from a third-party service; it pulls up behind your car and then, with precision only available to modern robotics, there’s a light shudder as this new vehicle docks with yours.

In Tad’s vision, the docking cars are not limited to restrooms and vending machines. Perhaps a mobile Gold’s Gym could pull up to you. Or maybe a mobile Starbucks. And of course, since futurists often don’t anticipate the adverse effects of their actions, perhaps the docking car could be a drug dealer or a mobile bordello.

I personally have some questions about whether the economics of this would work, but as I said before, it’s cheaper to have a fleet of driverless docking cars than it is to lay high speed rail across the desert.

And Loren Feldman may disagree and have his concerns (for example, what if a disabled person is in the driverless car, and the driverless car system fails and the disabled person can’t do anything about it?). But Google does have some political influence, at least in the western United States, so it’s more than likely that someone will be able to experiment with a driverless car service in the future.

Animal House 1804

I don’t watch a lot of movies, but I have seen Animal House, an account of a memorable year at the fictional Faber College. At the end of the movie, it is revealed that one of the most animalistic of the Animal House students, Bluto (portrayed by John Belushi), would eventually become a United States Senator.

Ignoring the fact that one of Belushi’s co-workers eventually WOULD become a United States Senator, there is a historical parallel to the “bad college student becomes respectable” story.

In 1804, there was a school in Lexington, Virginia named Liberty Hall. Today it is known as Washington and Lee University. Back in 1804, however, an incident occurred in the city involving a college student:

Several students are suspended when a Lexingtonian complains that they “stripped naked in the public street in a clear moonlight night between the hours of 8 & 9” to bathe at a public pump. In August, senior George William Crump is suspended for the remainder of the session for running naked through the streets of Lexington.

The account, on Washington and Lee’s own website, goes on to say:

(Crump later becomes a congressman and the U.S. ambassador to Chile.)

Congressman Crump’s official biography (which omits the streaking incident) is here.

And as for Senator Blutarsky, Congressman Keith Ellison once inadvertently quoted the Senator’s historical knowledge.

ASCII and get the ANSI when U ni(d) code – why EBCDIC is like it is

I was saving a text file one day, and noticed that I could save that file in various formats, including several Unicode formats and an ANSI format.

As I was looking at my options, an old acronym suddenly popped into my head.


That acronym was EBCDIC. This is probably the first time that I’ve thought about EBCDIC in this millennium, so I had to research it to remember what it was. I ended up at Dynamoo.

ASCII is the American Standard Code for Information Interchange, also known as ANSI X3.4. There are many variants of this standard, typically to allow different code pages for language encoding, but they all basically follow the same format. ASCII is quite elegant in the way it represents characters….ASCII is essentially a 7-bit code which allows the 8th most significant bit (MSB) to be used for error checking, however most modern computer systems tend to use ASCII values of 128 and above for extended character sets.

Although not mentioned in this particular write-up, Unicode is essentially an extension of ASCII.

But there is another character standard floating around.

EBCDIC (Extended Binary Coded Decimal Interchange Code) is a character encoding set used by IBM mainframes. Unlike virtually every computer system in the world which uses a variant of ASCII, IBM mainframes and midrange systems such as the AS/400 tend to use a wholly incompatible character set….

Before you say that this is irrelevant, consider that there are still a number of AS/400 systems – whoops, I mean IBM iSeries systems – out there, used in places such as colleges.

Why am I writing about this in tymshft? Because of the origins of EBCDIC.

IBM mainframes and midrange systems such as the AS/400 tend to use a wholly incompatible character set primarily designed for ease of use on punched cards.

So even though no one uses punched cards for computer input any more, there are certain systems that are the way they are because of punched cards.
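Python still ships codecs for EBCDIC code pages, so you can see the punched-card legacy directly. The sketch below uses code page 037 (EBCDIC US/Canada): in ASCII the alphabet is one contiguous run, while in EBCDIC there is a gap between ‘I’ and ‘J’ (and another between ‘R’ and ‘S’) – an artifact of the zone-and-digit layout of punched cards, where A–I, J–R, and S–Z each shared a different zone punch:

```python
# ASCII: 'A' is 0x41 and the letters run contiguously A..Z.
print('A'.encode('ascii').hex())   # 41
print(ord('J') - ord('I'))         # 1

# EBCDIC (code page 037): 'A' is 0xC1, and the alphabet has gaps.
print('A'.encode('cp037').hex())   # c1
print('J'.encode('cp037')[0] - 'I'.encode('cp037')[0])  # 8 – I=0xC9, J=0xD1
```

That eight-byte jump between ‘I’ and ‘J’ is the punched card showing through: the EBCDIC letter ranges 0xC1–0xC9, 0xD1–0xD9, and 0xE2–0xE9 mirror the card’s zone rows.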

How Google’s cars promote music…and future competitors…

When a new process or a new technology emerges, it has effects that could not be anticipated by the people who originally developed the new process or technology.

Rather than using a technology example, I’ll use a process example – one that is ripped from today’s headlines (assuming that today’s headlines are actually printed on paper). In the past, voting in the United States required a person to go to a polling place on a particular day and cast his or her ballot. Increasingly, however, people are voting before so-called “election day,” and they are not going to a polling place to do it. This created a problem for the candidates: last-minute advertising on the day before the election became less effective, because many people had already voted. The answer, of course, is to start airing those irritating campaign ads earlier and earlier, just to make sure that the candidates catch all of the voters before they vote.

Which brings us to self-driving cars, and some more thoughts from Tad Donaghe. If you ever read a blog, be sure to scroll down and read the comments that respond to and elaborate on the blog posts. One such comment, left by Donaghe, points out that his prediction of the re-emergence of online grocery stores rests on his other prediction of the increased use of driverless cars. Donaghe:

I think one of the leading causes of failures of previous grocery delivery systems has been the high cost and pain of delivery using human drivers.

The next time it’s attempted it will be based on robotic, self-driven cars. This will completely eliminate the human labor costs involved as well as human error (there weren’t many car-based GPS systems back the last time grocery delivery was tried).

To read the rest of Donaghe’s thoughts, go here.

This is one example of how a new technology – in this case driverless cars – can affect other industries. Recently on Google+, Donaghe speculated on another change flowing from the driverless car phenomenon – a lot of empty garages.

Many people will simply opt not to own a car at all. I think car “subscriptions” will become popular, where you’ll pay a set amount each month (for less than a typical car payment) for the use of any of a number of vehicles as needed. Other people will just share cars via car clubs.

It should be noted that this is already happening, especially in heavily urbanized areas and at universities. Even relatively small colleges are offering on-campus car subscription services for their students.

So what does this mean, according to Donaghe?

One of the results of this is that the millions of garages attached to homes around the country will suddenly no longer be needed for car storage.

Tad offers his specific predictions here about what will happen with these garages – he sees companies like IKEA getting a windfall.

But I wonder.

When I think about garages, I think about four uses.

First, there’s the traditional use that Tad mentioned on Google+ – garages can be used to store cars.

Second, there’s the other traditional use of garages – they can be used to store everything but cars. My garage, for example, includes a box containing thousands of drinking straws. I’m not kidding. Sometimes you can’t pass up those Costco deals.

But there are two other potential uses for garages, and if you can get the car out of the garage, and if you don’t have any junk in the garage, you’ll start seeing these.

One of these is musical. I’m sure that you’ve heard the term “garage band.” In the days before Apple (and a particular software program), the term referred to a band that would actually practice in a garage. So once mom and dad get rid of the car in the garage, the kids may claim the space and become the next Kanye West or Lady Gaga or whatever.

But there’s a fourth use for garages, and perhaps my mention of Apple reminded you of this fourth use. In the old days, Apple used to be called Apple Computer. They built computers. And when Apple Computer first started, they needed a place to build the computers. Being Silicon Valley residents, the company founders did what others had done before them. They went to the garage.

So, if Tad Donaghe’s prediction comes true and people get rid of the cars in their garages, I expect that we’ll see an increase in garage bands and garage companies.

Perhaps one of those garage companies will rise up and challenge current driverless car champion Google.

And one of those garage bands might write the soundtrack.

The deep future of U.S. Army procurement

I don’t think that I’ve talked about the future a lot in the tymshft blog. Much of my conversation has focused on the relationship between the present and the past.

But there are people who are looking into the future, and one of those people is U.S. Army acquisition official Heidi Shyu. As Federal News Radio reports, Shyu is looking beyond the Army’s current procurements for Iraq and Afghanistan. After that, what next?

Unlike in the 1990s, the threats we face have not receded. As a matter of fact, they’ve grown more sophisticated. Then you add our reliance on a healthy industrial base for critical scientific, engineering and manufacturing skills that’s essential to our modernization efforts. We recognize that maintaining the Army’s leading edge in the future depends on this healthy industrial base.

In the past, the Defense Department has engaged in a five-year planning cycle. (Hmm…just like the Commies used to do.) But now the Army wants to engage in “strategic modernization planning,” with a 30-year time horizon. Lt. General Keith Walker believes that this will be better for the Army:

Our warfight and our focus has been very short in terms of Iraq and Afghanistan. But some of the consequences of that, when you look at the capabilities required to execute those concepts, it’s a very short-term time horizon. So when the department has to plan and prepare a science and technology program, we’re not properly informing that effort because we’re so short. So we will step back and make a very deliberate effort to gather people from academia, industry and our own research and development communities and operators to get an understanding of what may physically be possible in the year 2030 or so, because honestly I don’t know. But once I understand what’s physically possible from the threat and what our own country can do, then we can talk about some concepts that are out there and we can better inform our science and technology effort.

Of course, such an effort assumes that the Army can predict the future with fair accuracy. Military.com raises doubts:

There are those who doubt such a long range modernization strategy as the [Pentagon] has repeatedly highlighted the military’s inability to predict future threats and capabilities accurately. The Army especially has suffered when trying to develop technological advances as evidenced by the failures in the Future Combat Systems program.

The Future Combat Systems program, launched in 2003, included an effort to develop a 27-ton manned ground vehicle. This effort was cancelled by the Army in 2009.

Part of the issue was that funding for FCS was reduced due to the short-term activity in Iraq and Afghanistan. But there was another reason:

The Democrat-controlled 110th Congress has been understandably skeptical of the FCS, whose prime contractor is Boeing, because of its cost, complexity and the vast leaps in the dark on cutting edge, untried technologies that it contemplates.

But by definition, any modernization effort is going to rely on “cutting edge, untried technologies.” This suggests that the most serious threat to strategic modernization planning may be…ourselves. How can we engage in a 30 year modernization plan when the people charged with funding it are in two-year, four-year, and six-year election cycles?

Oh well. At least the government is still better off than publicly traded companies. Their equivalent to the election cycle is the required quarterly reporting. If you…um, tank in a particular quarter, your leadership may be “voted out of office”…and incumbency is of no help here.

Webvan 3.0?

On Google+, Chris Pirillo asked the question:

What are your tech predictions for the year 2021? Yes – over the next ten years.

Tad Donaghe took a crack at answering the question, and offered a number of future predictions, including predictions about the acceptance of driverless cars. (Did I mention that these conversations were taking place on Google+?)

But one of Tad’s other predictions intrigued me.

Most physical grocery stores will begin to go away as most consumers will prefer to receive deliveries by robotic delivery services – think UPS vans that pull up to your house full of the groceries you didn’t have to shop for. You’ll bring them in the house.

At first glance, it seems like a fairly strong prediction. After all, several years ago we thought that Amazon would never take off, and that people would insist on going to a real bookstore like Borders and looking at and buying real books. Now Borders is history and Amazon sells Kindles. So if such a sea change occurred in the book world, why can’t I buy Hamburger Helper and cereal online?

However, whenever one talks about online grocery stores, we have to remember that we went down this path before, in the initial dot-com boom.

By 2001, the biggest player in the online grocery market space, Webvan, was bankrupt:

On Monday, the Foster City, California, company said that it closed all operations and filed for Chapter 11 bankruptcy protection. In the announcement, which came just a year and a half after Webvan’s remarkably successful IPO, the company said it has no plans to re-open.

The cause? Rapidly disappearing cash reserves.

But why did Webvan have rapidly disappearing cash reserves? Unlike some other companies from that era that I won’t name right now, fraud wasn’t the issue.

Webvan’s problems never really had much to do with its customers. It was the lack of customers that was the trouble.

Back in those days of 2001, the Wired article quoted analyst Ken Cassar’s statement that Webvan “may well have been 10 or 20 years ahead of its time.”

Is now the time? Personally, I’m skeptical, as I noted in Tad’s Google+ thread.

I’ll grant that packaged foods easily lend themselves to online purchasing, but I think the jury’s still out on things like fresh food and vegetables. And I’d personally even be leery of buying frozen or refrigerated foods from an online service – how long has the delivery van been driving around before it gets to your door?

But, as I acknowledged in the thread, I could be wrong.

Would you buy from an online grocery store? Do you buy from an online grocery store today, such as safeway.com or its southern California equivalent, vons.com?

Or perhaps you’d want to get your groceries from the Nigerian equivalent, BuyCommonThings.com.

Hmm, Tad may be right on this one.
