Silicon Valley Bank and eVestment Hit a Nice Exit with Nasdaq

People often fixate on the Home Run Deals: the Googles, the FaceBooks and the like. A home run deal is a 50x-100x return on a deal where you get in 1) really early, 2) really cheaply, and 3) ride to an astronomical valuation, 4) staying with it the entire ride without being diluted. It is the stuff of movie magic and business books. But it’s also extremely rare and risky.

FaceBook was able to manufacture a success at a critical time by using Russian money in place of an interim round. Plus they locked up all world markets before positioning their final public offering. This was done at the end of the process, not the beginning. They discovered you don’t wait for things to go up. You force it to go up by controlling everything. Yet it was still a white knuckle ride and there were a lot of very big mistakes that were costly – the pajama adventure at Sequoia was quite memorable and led to quite a bit of trouble for the lad. That’s why he needed the Russian money. And as one VC likes to say, “You don’t screw around with the Russians”.

So how do folks make money in the vast majority of Silicon Valley tech deals? What early investors look for is someone who knows how a business works. They have a strategy to consistently grow a business by taking advantage of their expertise and contacts. They have sufficient resources to fund that growth while widening their customer base so as to present a desirable acquisition. And they need more than one potential acquirer who sees them achieving this steadily.

And a good example of a reasonable and profitable tech venture investment is the recently announced acquisition of eVestment by Nasdaq.

The company raised $19M from Silicon Valley Bank. Over the next six years they made seven acquisitions, allowing them to aggregate value, obtain several key customer accounts, and present a compelling proposition for Nasdaq, which acquired them for $705M. Their investor was in it for the long term, as it takes time to create a credible business in the financial sector.

For those who think this was too long a wait for the money: if you took that $19M and invested it at a 9% annual yield, compounded over six years, you’d end up with a future value of around $32M.

Their primary investor, Silicon Valley Bank, made 22 times the future value of that money invested. The rule of thumb is ten times for a smart investment. Bravo to eVestment and Silicon Valley Bank.
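A quick sanity check on those numbers, as a sketch assuming simple annual compounding and using only the figures quoted above:

```python
# Back-of-the-envelope check of the figures above.
# Assumes annual compounding at the 9% benchmark yield.
principal = 19_000_000   # SVB's investment in eVestment
rate = 0.09              # benchmark annual yield
years = 6

# Future value: FV = P * (1 + r)^n
future_value = principal * (1 + rate) ** years
print(f"Future value: ${future_value / 1e6:.1f}M")   # ≈ $31.9M

# Multiple implied by the $705M Nasdaq acquisition price
multiple = 705_000_000 / future_value
print(f"Multiple: {multiple:.0f}x")                  # ≈ 22x
```

That lands right at the 22x figure, roughly double the ten-times rule of thumb.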

Is Google Just Another Uber Bro? Unraveling the Tangled Silicon Valley Tech Geek Myth

The most recent attack on women and minorities in Silicon Valley has arisen unexpectedly from Google. Mounted by an anonymous Google engineer as a “manifesto”, it presents no facts, regurgitates disproven theories on the “biology” of men and women and, most tellingly, blames diversity for upper management’s cancellation of underperforming products at Google.

Many people have already started to address the clear absurdities of this person’s claims, from both an internal and external perspective, along with myriad professional press musings too numerous to mention (try Google).

Of special note, the key weakness in this memo is that women and minorities have nothing to do with the lament of Google “demoting the core business” and “killing beloved services”. This is odd, and leads to speculation that this “manifesto” is nothing more than a disguised attack on the streamlining decisions of Google CEO Sundar Pichai. In other words, this guy is trolling people with a red herring of “diversity”, and the real intent is to embarrass Google executives for cutting a “beloved” project. Move along folks – nothing to see here.

But the fact that many people see this rant as factual does merit some discussion. This means we have to dive into history a bit. I know most people have little patience for the past. But it does help to know how we got from here to there. It wasn’t random chance.

There are a lot of women who have worked on technology projects in SV over the years, but you wouldn’t know it because no one writes about it, so no one believes that it happened even though this is a young industry and most of us are still alive. That missing piece of the story leads to the notion that women have not had any involvement in any technology and it’s a man’s world. It’s an absurd notion.

Whenever one sees these attitudes, one also sees a history that has been deconstructed to focus on one person at the expense of others – unless, earlier in the history of the field, there were key women who could not be written out, as physics has Curie and Meitner. Those who control the information – tech journalists, writers and amateur enthusiasts – have had an almost laser focus on men. Why?

In practice, tech readers rarely notice the name of the author on the article, which is why it’s pretty easy to write about hard tech even if you’re a woman. But they do notice who is being interviewed, reviewed, or cited as an authority, and it’s usually a man because, as any editor or publisher will tell you, “That’s what the reader wants”. If this seems circular and under-justified, it is.

The second item is the current obsessive focus on low-level “pipeline” women in tech. While it’s important to get women into the system, it’s actually retaining them at the lead engineer, line manager and director level that matters. There is no focus on that.

Thirdly, it was not uncommon for women to be part of a Founding Team that was funded in the 1980s. Startup teams were typically at least three people reflecting technology, operations and/or finance, and sales. Even if a woman was not an engineer, she would still be viewed as an equal Founding Member for her business, marketing or sales skills. This was also true of black men and women, especially in business and sales, due to their strong presence in old-line companies like IBM.

This Founding Trinity structure became less common as the Cult of the Geek became a meme in Hollywood, a follow-on to the Western antihero. The story was recast from a team of rather dull startup business equals making spreadsheets and chips and PCs to the lone tech guy going against all odds to fight the System. It was amusing at the time, but it’s been done to death. How many more Steve Jobs movies or operas are men going to write? I’d rather see an opera about Marie and Pierre Curie. Now that was a scientific tour-de-force love story.

But it’s considered normal storytelling in journalism and entertainment to interview / romanticize / suck-up to men when anything serious is discussed to avoid alienating the reader aka men. Are only men readers, users and developers of tech? No, they aren’t. This unquestioned assumption perpetuates the notion that women don’t work in the field, aren’t interested in studying or reading about tech (or science, economics, politics …), and that men are the only instigators of creativity and change. It’s lazy writing, but it’s easier to meet Internet deadlines when you write by recipe rather than by the old-fashioned research / rewrite / review.

Coincident with this fascination with the lone geek, the tech people who rolled out of Berkeley and Stanford at that time found themselves in a rather unpleasant quid pro quo: to get a good reference, a student might have to spend considerable amounts of personal time on unpaid or low-paid tech projects. Since jobs were rather scarce (we went through several big recessions in the 1980s, kids), and a reference was really important to landing a decent job, there were lots of people willing to do this. It became a bit of a seller’s market. In a seller’s market, choices often come down to whim and comfort, and that’s exactly what happened. A “like prefers like” arrangement developed among key professors and their lowly student help, reducing management overhead and expanding their collegial networks. It’s human nature to seek familiarity and comfort when excellence is a commodity. The myth that “someone like me is easier to manage” prevails today.

There was also considerable selection bias in computer science and engineering majors in the 1980s. At Berkeley, there was so much demand for engineering one had to compete to enter as a freshman in the college, which precluded people who were unsure from entering the major. A woman who wanted to be an engineer had to not only know how to apply directly to the College of Engineering at Berkeley, but also have the confidence and will to be an engineer despite the high school tendency to channel women towards the “softer” majors (if encouraged to go to college at all).

The safety valve for learning programming was that in the 1980s Berkeley placed few restrictions on non-major students taking CS courses, and those could be easily waived by a Dean. Berkeley tightened this loophole in the 1990s due to budget cuts, essentially cutting out many people who a decade earlier could still take CS courses while in other majors. This led to an EE/CS bottleneck. Stanford had a much smaller pipeline, as most top-tier private schools still do. Berkeley was the big one for matriculating people into the field in SV, and its stranglehold on access had a profound impact for two decades.

This skinny-pipeline, reduced-risk preference for “guys like me” was also the golden ticket to investment referrals. The hard tech innovation that flowed out of universities – from Berkeley Unix, to RISC, to databases, to languages – was a lucrative and exciting opportunity for people who resented the indentured servitude of academia. They left to found startups based on these technologies. And the most skilled at these technologies were the same people who had been most willing and able to work for nothing. Stanford, sensing an opportunity, actually refined the pipeline for investment, offering students access to alumni referrals and networks for a “piece of the action”, and reaped a windfall. Berkeley, in contrast, retreated further into academic narrowness, resenting the desertion of so many into the very industries it helped spawn.

The reason we are seeing discontent today has two key factors: 1) the ability to access excellent introductory and focused courses in programming at a cost-effective level is within the reach of many, and 2) the value of an EE/CS degree has declined. The latter is a result of SV growing to encompass mature industries and verticals. Unlike twenty years ago, it isn’t particularly important for every programmer to know how to write a compiler or understand graph theory, and many excellent programmers are self-taught, strangely enough just like many of the early SV pioneers. Most programmers and engineers also work on extant projects, adding some code here, fixing a bug there, and rarely work on a new project or technology de novo. There is more demand for Stanford business school graduates to manage logistics and funding than Berkeley CS programmers to create new technologies.

In addition, the reliance on global access to talent has had an unexpected effect. The number of women from other countries with STEM degrees working in companies in SV is quite high. The women I meet at women-in-tech events (and by this I mean hard tech, since that is my field) are predominantly foreign researchers, programmers and engineers from India, China and former Eastern Bloc nations. There is far less stigma for a woman going into a STEM field in these countries, and it shows in practice. These women are educated, ambitious and not afraid to speak out.

The American women I see at technology events are most commonly clustered in the data science area, and often possess advanced degrees in STEM fields. They are comfortable with data science because many STEM fields work with very large datasets and the tools, techniques and processes are the same when one is analyzing weather patterns or consumer patterns. There is also a reemergence of the value of biology, physics and mathematics degrees in biotech, aerospace, and fintech, respectively. In all cases, the calibre of talent is high and increasing.

The preference of companies like Google for obtuse whiteboard quizzes drawn from upper-division CS classes over work, references and experience to validate “fit” reinforces a “CS degree from a top-10 university” bias that is obsolete in industry today. It also has the effect of favoring recent college graduates over those with more experience.

Most of the tech pioneers – women and men – who actually did accomplish interesting projects / research / startups / technologies in the prior generation would be weeded out of the hiring pipeline today because while they had a heck of a lot of experience working on technology projects, they didn’t spend their time studying code quiz books. I have a Berkeley physics degree. While it’s a plus to people like Elon Musk (who also has a physics degree) in emerging industries of new space or electric vehicles, it is a minus at Google, FaceBook and other SV new old-guard companies. That is how their metrics and processes work, and they’re happy to keep it this way.

But are they really happy? Is this stasis good for their business? SV management has clearly not kept pace with the social changes in our industry, preferring nostalgia and a “that’s how it’s done” attitude to on-the-ground knowledge and change. This is the same pattern that emerged in the prior generation of old-guard companies of Xerox, Bell Labs and IBM, among others.

The hard truth is many successful SV companies are stuck in a midlife crisis where doing things the old way and fitting in is more important than challenging extant processes, technologies and business models. When this occurs, the time is ripe for a paradigm shift. This is now happening, and it’s making a lot of folks very uncomfortable. They lash out. They blame others. They want things to go back to the way they were.

To sum up: the requirement of unpaid labor and the selection bias of EE/CS professors on key projects at Berkeley and Stanford in the 1980s; the tightening of the pipeline through budget cuts in the 1990s, which reduced the ability of men and women not already declared in the major to “try” programming; the growing reliance of investors on referrals through this narrowed academic pipeline to find innovative technology startups; and the tech press’s fascination with and support of male enthusiasts who reinforced the Cult of the Geek all led us to what we see today: a peculiar, devout belief that programming is a man’s job. And that belief is threatened by the sheer number of women in SV now clamoring for a seat at the table.

Sweeping aside all the vanity, programming at its core is working with words in a stylized manner to achieve a desired function. I’ve always found programming more akin to writing a sonnet in terms of the structure than prose. Fixing code is like writing a limerick. It’s not male. It’s not female. It’s just a tool, no more male or female than a pencil. We spend a great deal of time teaching kids in school to learn the tools of language, writing, mathematics and science. Programming is just another tool, with no special or endowed gendered significance.

I think I’ll go write a sonnet. It’s been a while, but I still know how.

Google Cloud OnBoard San Francisco: Buried Alive by PR

There are times when a seminar or conference or training session induces trepidation because the expectations are high. One questions whether it was worth the time to travel to the destination, wait to park in the wreck-a-lot, find the coffee urn empty, and then find a chair in the back where you can barely hear the speaker. All the while, Slack messages are building up at home base. Is it worth it?

I’ve always found a reason to make the trip worthwhile – a small tidbit of knowledge, an off-the-cuff experience, an interesting speaker. Sometimes I run into an old colleague and we chat over lunch. Maybe even something *new*.

Then there was Google Cloud OnBoard San Francisco. This conference did not meet expectations. And given the stakes in the battle for the cloud between Amazon, IBM and Google, Google must excel. It did not.

Google advertised this conference as an all-day in-depth technical “training” session on the Google Cloud Platform (GCP). Anyone who has been to AWS conferences knows what that entails: a keynote on where the cloud is heading by an executive including *numbers* on pricing and trends, a set of overview talks on the technology, and then breakout sessions on specifics from hardware to apps so the attendee can focus on their specific expertise.

Google offered none of this.

Instead of an informative keynote by someone who matters on the trends and reasons for using Google Cloud, Google offered weak PR. Instead of analysis, Google offered bluster. Instead of technical expertise, Google offered mean jokes with a smattering of contempt for their audience.

And instead of “training”, Google offered a bait-and-switch “try and save” set of random slides, presented by an obnoxious Henny Youngman wannabe who would cut off the few good technical speakers with “Hey morons, even my wife can do this” remarks. The constant, disruptive repetition was so annoying that the audience became more and more irritated and combative during the few times they were allowed to ask questions.

Of course, beyond insulting the men and women who just happen to be programmers, engineers and managers with the “my wife” spiel, the icing on the cake was the snarky responses to genuine audience questions comparing AWS issues to Google Cloud – exactly the comparisons any attendee would be hard-pressed to answer at the weekly engineering team meeting. “Yes, boss, I spent the entire day at the Hilton in SF, and all I got was this lousy t-shirt” is not a satisfactory answer for any company engaged in serious work in the cloud.

The irony of all this hand-waving by Google’s inept training staff is that, if you can actually glimpse behind the curtain, there is some damned fine technology and thought put into Google Cloud.

You wouldn’t know that to hear the folks pitching it. Except for two competent people, a younger man and an older woman, who squeezed in a few technical discussions before they got shut down by the “host” of the event, it resembled nothing more than a mandatory cheer assembly for the losing high school football team at “More Science High”.

Two pluses. One: Since the training conference was a bust, we worked through the Qwiklabs “free” training kit (in record time I might add) of six hands-on “labs” on “GCP Fundamentals: Core Infrastructure” (Getting Started, GAE and Datastore, GCS, GKE, GCE, BigQuery). These were clearly ripped from a Spring bootcamp on Google Cloud and were rather haphazard, but far more informative than the actual training session. Two: If we passed Go (all the labs), they promised us $200 Google Cloud dollars.

We shall see if the GCP bucks are worthwhile or as useful as monopoly money.

Amazon is a company that decided to monetize their own product for other enterprise customers. In Microsoft parlance, they “eat their own dog food”. This means if their customer suffers, they suffer. It also means as an early adopter, their technology is arcane and hybridized. But they understand their customer, are willing to “buy in” to hold the customer, and constantly advance their reach.

Google can afford to learn from Amazon’s mistakes and make a *clean* Cloud – efficient, effective and reliable – at a cost-competitive advantage. But their attitude towards the very people they need to woo away from Amazon, to put it mildly, stinks. They’re arrogant, abusive and vulgar – even to their own technology staff.

And that is worrisome to any engineer or manager betting the company on a system that works all the time, every time. If you can’t trust Google to be serious about where you store and access your critical data, why invest the time and money in moving to their platform?

My compliments to the chef on the roast beef sandwich at the Hilton. It was definitely the high point of the day. The hotel staff were also quite pleasant. A hospitality business understands that the experience matters to a customer. A bad experience, and they may leave a bad review and never return.

Get your act together, Google.

Hello world!

After a non-brief hiatus where health matters intersected with work matters, I’m back to writing about technology, policy, people and innovation in Silicon Valley.

I’ve always lived in Silicon Valley. I was born in Fremont, got my physics degree at UC Berkeley, and have worked at and co-founded several tech companies here. I rode my bike through orchards and fields now filled with homes and shops. I drove two-lane roads now turned into always-busy expressways. I went to school with people who have gone on to successful careers, even reshaping industries… and some who are no longer alive.

I’ve lived in Los Gatos for the last two decades. It’s a nice town (really and literally, the Town of Los Gatos), with a splendid library, just out-of-the-way enough to participate in Silicon Valley without enduring too much of the transitory madness of chimerical tech trends.

It’s an easy drive to all those places that matter, although those places have changed too. What was hot is soon not. Money changes hands, or vanishes into pockets. And the must-hear pitch is as quickly forgotten as yesterday’s weather.

There are patterns and anti-patterns to Silicon Valley, this Valley of Heart’s Delight. But now the heart is made of silicon and transistors and zeros and ones. It beats in picoseconds through cores of processors and devices. Thoughts and dreams and desires, both subtle and base, are accessible with a touch.

Arthur C. Clarke said, “Any sufficiently advanced technology is indistinguishable from magic.” Perhaps this is why the gap between science and public policy, education and superstition has grown in the United States. This is a threat to Silicon Valley innovation and national security.

As the complexity of technological and scientific innovation increases, those who “productize” and “monetize” innovation have successfully hidden much of the actual scut work of design, development and manufacturing from prying eyes and, in some cases, regulatory fingers.

Corporate HR blacklists to protect Silicon Valley monopolies, creations of fictitious businesses and paper transfers of technology IPR invented here to bogus tax havens, and the hands-off creation and supervision of manufacturing facilities in poor countries with the inevitable stories of abuse and exploitation all paint a less than flattering picture of a “maturing industry” disinterested in innovation.

But where does one go when the “renovation, not innovation” mantra becomes tiresome, and magic doesn’t cut it?

Complex products made “simple” are inherently deceptive in both design and intent, but not because it makes technology accessible. Rather, the cult of “simple” design obscures technology as magic. And once technology becomes magic in people’s eyes, the value for the foundational knowledge and experience underpinning innovation is lost.

So we come full circle.

Can we “Tawk”?

Phil Bronstein today asked the unmusical question “What Tech Buzzwords Make You Go, ‘Huh?’” He brings up terms like “interstitial” (like, look it up buddy, it’s in the dictionary) and “open source” (if you don’t know this one by now, you’re doomed).

But what if the technical term is, to put it delicately, eff’d up?

A story from Jon Erickson, the legendary editor of the late, great Dr. Dobb’s Journal, told to yours truly, illustrates: One of the cover stories was on the Thompson AWK language, and as editor he set the enthusiastic tone (yes, some folks get really excited at the thought of AWK) with “TAWKing with C++”. However, somebody wasn’t minding their p’s and q’s (back when people actually did mind p’s and q’s). When the magazine cover came back for final review, it said something slightly different: “Twaking with C++”.

I don’t know if meth-heads read DDJ, but Jon wasn’t too pleased. Reportedly everyone could hear it fly across the room and wham into the door. Oops.

Later, as a joke, the staff put together a fake cover with another “twak” reference. This is why journalists are heavy drinkers and why editors have short tempers.

SpaceX and Open Source – The Costs of Achieving Escape Velocity

The successful low-earth orbit of a Dragon capsule mock-up by the Falcon 9 rocket was a great achievement by SpaceX last week (June 4, 2010) and a harbinger of the new age of private space transport. As I watched their success, the excitement from the press and space enthusiasts, and the unexpectedly vindictive response from many inside NASA, I was reminded of the launch of 386BSD – and why those most able to understand your achievement often are the most parochial.

Space exploration is a family tradition for the Jolitz clan, from William L. Jolitz developing transponders and thin and thick films for many spacecraft at Ford Aerospace (some still transmitting telemetry long after his passing), to his son William’s work at NASA-Ames on an oscillating secondary mirror for the Kuiper Airborne Observatory as a high school intern, to his three grandchildren working at NASA in astrobiology (Rebecca Jolitz), orbital dynamics and space fatigue simulations (Ben Jolitz) and spacecraft logistics for science projects (Sarah Jolitz).

Ben and Rebecca Jolitz also had the opportunity to meet Elon Musk, founder of SpaceX, at the 2007 Mars Society Conference held at UCLA and were inspired by his vision and determination (it was at that conference that Ben decided he wanted to major in physics at UCLA). Though still in high school, they had received honorable mention in the Toshiba ExploraVision competition for their highly creative concept of a Mars Colonization Vehicle based on using an asteroid in a controlled bielliptic orbit as a transport vehicle to provide the “heavy lifting” of supplies and personnel between Earth and Mars. They were invited to present a more detailed talk on their project for the Independent Study track. The speakers at this conference were a great spur to their scientific enthusiasm.

So it was with great satisfaction that I watched SpaceX demonstrate that “rocket science” is no longer the province of great nations but instead will bring about a democratization of space – cargo and transport, experimentation and eventually mining and exploration.

This is no quick path, however. The struggle for open source software – beginning with Richard Stallman and his remarkable GCC compiler, Andy Tanenbaum’s Minix system, John Lions’ careful documentation of Version 6 Unix, our Dr. Dobb’s Journal article series on 386BSD Berkeley Unix and subsequent releases, and Linus Torvalds’ amazing synthesis of the prior Unix, 386BSD and Minix works to achieve Linux – occurred during an enormous burst of creativity that actually totaled about five years (1989-1994). After this came the long process of usability design – driver support, GUI support, applications support, new scripting languages – which is still a work in progress after another decade. Big vision projects take a lot of time and are not for the timid.

It is no secret that NASA has been struggling for many years with a lack of purpose. Just like Unix in the mid-1980’s and Windows in the mid-1990’s, technology which is held too tightly to a single group or company or national agency tends to calcify. Innovation becomes too risky. Agendas and interest groups override design decisions which may theoretically impact their funding. It becomes easier to add to a design than subtract from it, resulting in an unwieldy project which never converges in form or function.

Eventually, more effort is put into maintaining the flaws than eliminating them. Bugs and unexpected interactions begin to dominate, resulting in more meetings, workshops and conferences. Tools to manage the side-effects and flaws of the project become the object of research, while the actual project suffocates as it becomes more and more obese.

The life cycle of an operating system, like the life cycle of a space exploration vehicle, encompasses a brief burst of risk-taking and innovation followed by a long series of “rational” decisions which add heft and gravitas, followed by bloat, loss of purpose and final collapse. But during the long period of bloat and dementia, the lack of satisfactory execution provides an opportunity for newer faster designs leveraging new technologies in other fields to pry into previously unobtainable market niches and slowly eat out the old markets. This happened with open source, and it is happening with space exploration.

The shuttle itself is over 35 years old and encompasses aging technology which can no longer be retrofitted – and has long been scheduled for decommissioning. This schedule has been put off again and again for two reasons: 1) the US has refused to properly fund and schedule a replacement because the costs and commitment are very great, and 2) the maintenance and rocket groups are based in key states dependent on continual funding. Politics as usual has been to fund existing projects when we are long overdue to redefine NASA’s mission and goals. And, as is often the case, in refusing to examine other options, we have been left with only one option – end the shuttle program and depend on other nations and consortiums for transport – Russia and the European Space Agency primarily.

The Bush-era Constellation program was in theory supposed to provide an alternative, but the results were laughable – it became a symbol of a bloated, self-referential, insatiable rocket bureaucracy that couldn’t build a real rocket to get pizza, much less get to the moon. And there was the tragic side to this – so many Americans love to complain about their government by saying “If we could get a man to the moon, why can’t we do” whatever, when in reality America lost that ability over 25 years ago with the decommissioning of the Saturn V rockets. Since then, like many other government make-work projects, “rocket science” has devolved into fantasy PowerPoint presentations and one-off prototypes that might have been flown, except for the risk of failure.

So while the science side of NASA, with their unmanned probes and experiments and space telescopes, has continually advanced despite the occasional loss, the rocket side has cowered, fearful of failure yet addicted to the status quo of “no risk = no failures”. And this stance, while appearing to play it safe, has created more opportunities for the SpaceX’s of the world as space transport, satellite maintenance and other niche markets look for more effective and less expensive approaches.

Competition, we are always told, is good for America. After all, it was competition with the old Soviet Union that launched the space program – and the need to hire rocket scientists to get us up there. So in principle NASA’s rocket guys should be pleased with SpaceX – they can leverage SpaceX’s experience while encouraging their own demoralized workforce to become more innovative. Like open source, the knowledge that “it can be done” should provide both a relief to fear and a spur to greatness. It’s a win-win, right?

So why the malice and anger? Why did so many within the agency that could most benefit from this knowledge wish SpaceX ill? Why are they running down their achievement? Why aren’t they rising to the challenge? Aren’t they eager to break out of their repressive paradigms?

While envy and fear of change play a great role here, the loss of status is most pernicious. During the rise of open source, a new set of designers and developers began to set the pace for innovation. Many programmers frustrated in their work in industry found an outlet in open source. An avalanche of ideas – good, bad and indifferent – could no longer be repressed by groups controlling proprietary operating systems source. These groups – corporations, standards committees, technology “gurus” – derived much benefit from the old system. They were the leaders at conferences, the movers and shakers of agendas. More than even money, they had the power to elevate or destroy ideas and people on a whim. And believe me, I saw what happened when people didn’t “get with the program”. It wasn’t pretty.

When 386BSD was born, I was told by many in the hard-core Unix side that it would be “strangled in the cradle” – either by lawsuits (which of course, never happened) or by ridicule (which did occur, constantly). I didn’t believe it. I just couldn’t believe that the experts I knew in the biz would wish it ill when they had an opportunity to finally work with BSD without all the proprietary license rigamarole. For years I had heard people complain about all the agreements and licenses and restrictions and “If only it were unencumbered”. Now that they had their wish, wasn’t it great?

Boy, was I misled. What I saw as an opportunity, many other good, talented people saw as a threat to their comfortable professional existence. I understand comfort, and I never wanted to make anyone unhappy. But in giving them what they had wished for, I did make them unhappy, because I also gave it to everybody else – and that was inconvenient. Well, I plead youthful enthusiasm here for misunderstanding their desires. But if given the chance, I’d do it again, because it was the right thing to do – even if I did it the “inconvenient way”.

So what were the claims? I was told nobody would use open source because it didn’t have a big company behind it – and we see today that was wrong. I was told that nobody would make money off of open source – and today we see many companies developing profitable businesses off of support and new design. I was told that nobody would use open source to innovate, and yet I use entirely new applications and languages that were not even thought of at the time Dr. Dobb’s Journal launched the “Porting Unix to the 386” series in January of 1991. I was told that the only way to distribute software was by selling it on a disk, and that we were crazy to put it out on the Internet, and yet now this is the way even proprietary software is distributed. When I talked about Internet-based OSes, I was literally laughed at by experts I respected – and it hurt – but now we see the beginnings of the “webOS”.

The ridicule did have real and lasting effects. The constant intimations by Unix groups of pending lawsuits that never arrived but always “loomed”; the personal strain of creating entire OS releases on a shoestring budget funded mostly by writing articles and refinancing our house while raising three young children; the ever-escalating expectations of a consumer audience demanding a commercial OS with all the bells and whistles, dissatisfied by traditional Berkeley Unix research releases with their traditional demands of self-administration (in a “damned if you do, damned if you don’t” moment I actually insisted the Dr. Dobb’s OS release installation and administration be automated by default, with the traditional installation process selectable if desired, and was then ridiculed for not making people do it the hard way – sigh); and finally, the relentless badmouthing of any new approaches in the kernel – the raison d’être of Berkeley Unix but not, admittedly, of a commercial corporate proprietary system – all took their toll. The last of these was the hardest to bear, frankly – and I understood why many other designers, seeing this, fled to Linux. After all, the ridicule, badmouthing and blacklisting was a piece of what they had experienced in their companies, so why endure it in a supposedly “open source” project?

So like 386BSD, the NASA badmouthers and their corporate masters could potentially destroy SpaceX. Yes, SpaceX is better funded than a two-person project like 386BSD – our original “Falcon 9” rocket was a 300-400 kbyte kernel plus some apps (386BSD Release 0.0) and 17 5,000-10,000 word articles plus code on how to do it yourself – but getting out of Earth’s gravity well, not to mention the psychological gravity well of believing you can do it (which seems to be more like Jupiter’s in terms of magnitude), is a heck of a lot harder. Ridicule, the inevitable technical setbacks SpaceX potentially faces, liability laments (ah, there’s that “lawsuits pending” stuff again), a steep learning curve, American impatience (doing new releases with some new innovative work in the kernel took us about 8-12 months – doing the next stage in rocket / capsule design will take longer) and media disillusion when the audience fades (no audience = no money) add to the burden.

But even if, somehow, SpaceX is marginalized, their accomplishments are *real* and will spur others to try. Linux was able to grow and thrive during this time precisely because it was *not* an American project – because it was based in Finland, the canards thrown at 386BSD were deemed irrelevant to it. Linux was a safe haven to many serious programmers disillusioned with the threats, lies and distortions promulgated around Berkeley Unix precisely because it was an outsider, uninfluenced by other interests.

People of ill will can kill an innovative project for a while. But they can’t kill the idea on which that project is based. It may be delayed for a while. But somewhere, somehow, it will spur others on to try. SpaceX, like 386BSD, is only the beginning.

It’s Raining Cupcakes – And Losses

Internet coupons have been stuck in the dark ages of print. Instead of using modern techniques like social networking and clever psychology (yes, a few companies have done coupon apps for mobile and SN sites like FaceBook, but they’re not very inspiring), most just create “print ’em yourself” coupons to be used at a store. And that’s a hassle. So to compensate for the annoyance factor, coupons delivered in this manner generally offer steep discounts.

Groupon has taken this a step further – offering really steep discounts on premium items *if* they get a set minimum participation (like 100 customers). But what if *too* many people agree – like three thousand? This happened to a tiny boutique cupcake vendor in SF recently, and it was three weeks of agony and spot buying of supplies to satisfy people. Was it worth it? Probably not, since the vendor had to pay more to satisfy customers paying less.
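The tipping-point mechanic itself is easy to sketch. Here is a minimal, hypothetical model of such a deal – the class, prices, cost and 50/50 revenue split are all illustrative assumptions, not Groupon’s actual terms – showing how an uncapped deal can turn every extra sale into a deeper loss for a small vendor:

```python
class GroupDeal:
    """A deal that only "tips" (activates) at a minimum number of buyers."""

    def __init__(self, price, minimum, cap=None):
        self.price = price        # discounted price per unit
        self.minimum = minimum    # deal tips only at this many buyers
        self.cap = cap            # optional maximum; None = unlimited
        self.buyers = 0

    def buy(self):
        if self.cap is not None and self.buyers >= self.cap:
            return False          # sold out
        self.buyers += 1
        return True

    @property
    def tipped(self):
        return self.buyers >= self.minimum

    def vendor_profit(self, unit_cost, revenue_share=0.5):
        # Vendor nets a share of the discounted revenue but eats full cost.
        if not self.tipped:
            return 0.0
        return self.buyers * (self.price * revenue_share - unit_cost)

# A cupcake-dozen deal: $10 instead of $20, tips at 100 buyers, no cap.
deal = GroupDeal(price=10.0, minimum=100)
for _ in range(3000):             # 3,000 buyers show up, not 100
    deal.buy()

# At an assumed $6 cost per dozen and a 50/50 split, every sale loses $1.
print(deal.tipped)                        # True
print(deal.vendor_profit(unit_cost=6.0))  # -3000.0
```

The missing piece, of course, is the `cap` parameter – a cap of a few hundred would have spared the cupcake vendor.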

It’s ironic that a decade after the Internet bubble and burst, a simple thing like vending a coupon is an enigma to companies and customers. I’ve done work in this area, and believe me – the level of cleverness and innovation here is very, very low. This is partially because of the demographics to which the old media group is wedded – older frugal housewives – and not the sexy 18-34 spendthrift guys dearly beloved by, well, most everybody selling high-priced junk and low-priced junkfood.

But for the poor cupcake vendor who got too much business for too little profit, I have only pity. No small business can scale to cope with flash sales, nor can it offer the kind of personalized attention that creates recurring customer sales. And the customers don’t see the boutique aspect of an artisan – only a cheap discount on cupcakes they might have bought at Safeway instead.

The Internet is a very powerful sales mechanism. Too bad people don’t give it the serious consideration it deserves with respect to the simple coupon. I think there’s a lot of money on the table and nobody wants to pick it up.

Delusional mom or out-of-control government agency?

A toddler is snatched by TSA officials from a weeping helpless mom in the middle of a busy airport and whisked away. Nobody helps. Nobody cares. A horror for any parent. But is this story true? Is our civilization so depraved and cowed that government can violate every aspect of decency and not be challenged or even noticed? I suspect many good citizens might agree with this – after all, isn’t government bad?

But of course “who watches the watchers”? There’s nothing like evidence to mess up a good story, and evidence we have. TSA released nine different camera shots of this distraught mom demonstrating *nothing* happened to her or her child. Nothing at all. Sorry folks – nothing to see here. Please remember to pick up your shoes and water bottles on the way out.

The fact that TSA had to release this video footage (long, detailed and from multiple camera angles to mitigate claims of “doctoring”) demonstrates how thoroughly paranoia dominates our culture – and why the appearance-driven airport security staged for the masses is consequently just as ridiculous.

When I reviewed one of Schneier’s books on security and culture, I was struck by his observation that security is handled in an “overt” fashion… public searches, obvious cameras, announcements, shoe and lotion inspections, and so forth, to provide the appearance of serious involvement. But many of these “glamurity” measures, while juicing up the public, are not the ones that are likely to uncover the real bad guys – remember that a group of determined terrorists took over planes with box cutters – those little blades to cut open boxes – not AK-47s or switchblades or cologne. It was organization, intimidation and the element of surprise that allowed them to succeed.

So the greatest concern regarding security in airports isn’t necessarily inspecting baby bottles (although on the basis that a bomb could be slipped into an unsuspecting child’s backpack or grandma’s purse, *everyone* must be searched – see, there’s that “organization” and “planning” stuff by determined bad guys again). Nope, the smart investment is in areas of automated photo recognition (do I know you?), examination of flight records (frequent flier? holiday to Tuva?), purchasing habits (cash or credit card? one-way or round-trip?) and ID (are you who you say you are and why are you traveling anyway?). This means realtime database analysis (a form of “business intelligence” pioneered by guys like Tandem to track your phone calls and credit cards – we *are* a consumer society after all) and lots of digital cameras. Oh, it also helps to have smart police who use their “instinct” to check out things – even though 9 times out of 10 there’s nothing there, there’s always that “tenth” time…
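A crude sketch of that kind of rule-based record analysis might look like the following – every signal and weight here is entirely hypothetical, chosen only to mirror the examples above; real screening systems are far more sophisticated (and classified):

```python
# Toy rule-based risk scorer over a passenger record. The fields, signals
# and weights are hypothetical illustrations, not any real TSA system.

def risk_score(record):
    score = 0
    if record.get("payment") == "cash":
        score += 2                        # cash purchases are less traceable
    if record.get("trip") == "one-way":
        score += 2                        # one-way tickets draw extra scrutiny
    if not record.get("frequent_flier", False):
        score += 1                        # no travel history to cross-check
    if not record.get("id_verified", True):
        score += 3                        # identity cannot be confirmed
    return score

routine = {"payment": "credit", "trip": "round-trip", "frequent_flier": True}
flagged = {"payment": "cash", "trip": "one-way", "id_verified": False}

print(risk_score(routine))   # 0
print(risk_score(flagged))   # 8
```

The point is not the particular rules but the pipeline: scoring happens in realtime against databases of records, and only high scorers get a closer human look – which is where the smart police “instinct” comes back in.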

So what’s the moral of this little story? That bloggers lie to get hits? Well, I think that we already knew that. That some women are crazy? Given the road rage I see daily it’s not just women here, but there’s a thick percentage of “crazies” everywhere. Nope, the moral is pretty simple: You are being recorded, and not just from the cameras you see or the cameras the staff knows about, but also from cameras the staff and you *don’t* know about. This data is *collected* and *analyzed* and can persist and be pulled for review long after you’ve had that “claimed” incident with TSA or the janitor. To be fair, it’s unlikely to be reviewed – after all, millions of people pass through crowded airports and this means petabytes of uncompressed data that has to be stored somewhere, so the persistence time is likely short. But since claims must be made quickly in a 24/7 Internet world, anyone who blogs “TSA stole my lunch yesterday on my business trip” may actually face video surveillance footage that either shows the staff scarfing down fajitas or shows…nothing at all.
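That “petabytes” claim is easy to sanity-check with back-of-envelope arithmetic. Every figure below – camera count, resolution, frame rate – is an assumed number for illustration, not data from any actual airport:

```python
# Back-of-envelope estimate of raw (uncompressed) surveillance-video volume
# at one large airport. All inputs are illustrative assumptions.

cameras = 500                            # assumed cameras at a large airport
bytes_per_sec = 30 * 1920 * 1080 * 3     # 30 fps, 1080p, 24-bit color, raw
seconds_per_day = 24 * 3600

daily_bytes = cameras * bytes_per_sec * seconds_per_day
daily_pb = daily_bytes / 1e15            # convert to petabytes
print(f"{daily_pb:.1f} PB/day uncompressed")   # 8.1 PB/day uncompressed
```

Even allowing for heavy compression, a few days of retention across hundreds of airports quickly becomes a serious storage problem – hence the short persistence time.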

But why, you may ask, are there so many cameras? Aren’t one or two enough? Isn’t that a “waste” of taxpayers’ money? Not necessarily, because subverting security is something that insiders like staff are prone to; hence, like banks, the vast amount of data collection revolves around monitoring the workers with access – did she just go around the gate? did he just feel up the customers? did they steal from the luggage? and so forth.

But as a personal observation, I’d like to point out a common sense analysis that doesn’t rely on technology nor expertise, but only relies on an understanding of human nature. I felt the most unbelievable aspect of this woman’s blogged claim of TSA child abuse was that nobody in line at the airport inspection station noticed or said anything during this “incident”. Now seriously, I know this is a paranoid “fear culture” where “nobody helps nobody but himself” (to paraphrase a con man), but do people really think that the woman waiting behind this distressed mother or the businessman just ahead of her waiting on a laptop inspection or the grandparents three feet away are *not* going to notice something as unusual as an agent taking a toddler away from his weeping mom? That during an unfolding drama people waiting impatiently to get to a plane will not notice the delay, press in closer and begin to demand explanations?

This is why this woman’s posting was complete nonsense – it completely ignores that we are social creatures who always want to know what’s happening with others. We comment. We rant. We watch. We get upset. Just as a couple of chimps arguing over a banana will cause the rest of the troop to press in closer, people will get involved – especially if there is a child. Grandma will crowd in closer to learn what is going on, the businesswoman four feet away will express concern for the toddler, a twenty-something will ask to speak to another agent. It is human nature to meddle in the affairs of others – that’s what being social animals is all about.

America is full of problems we need to solve to avoid a dystopian future, and misconduct by those with badges does occur and must be dealt with appropriately. But there are also lots of scammers, liars and jerks who feed off of the paranoia of our society and make it look a hell of a lot worse than it is. These bottom feeders destroy trust, blacken reputations and encourage cynicism. Instead of focusing our energy on solving real problems, we are instead distracted by idiots enamored with celebrity. We waste time. We waste energy. We lose as a society.

So while some might wish to dismiss this incident, I’d like to expand upon it as an object lesson in how going too far to aggrandize oneself can result in serious blowback. And I’d rather see a fame-obsessed woman trying to get a blog audience to raise her Google AdWords paycheck exposed as a liar and use this lesson to engage in a discussion of real security needs than see the converse – that in a crowded airport nobody would come to the aid of, or even question, what was essentially the official kidnapping of a toddler. That so many people are still willing to believe the worst here despite evidence to the contrary says everything about trust in our democracy.

The Number You Have Dialed, “S U N” is No Longer in Service

Sun Microsystems is gone. It is no more. It has met its maker. It is pushing up the daisies.

Given Sun’s long sad decline and incredible mismanagement, many are probably happy to dismiss it as a has-been that never actually did anything – grave dancing is a peculiar Silicon Valley tradition. But Sun’s demise does matter. Sun was the annoying colleague that was occasionally brilliant and creative but also had some very irreligious and disreputable habits that were unforgivable but too often forgiven. As it aged, it became a sotted gouty Henry VIII of Unix, irritable and tyrannical.

But there are also the memories of a young strong idealistic Sun, freshly spun out of Berkeley and eager to take on King Log IBM and DEC the Usurper. We shared the same roots – Berkeley, BSD, courses, research. We all bumped shoulders in the early days of Berkeley Unix and earnestly argued over technical proposals and RFCs now long forgotten. We left Berkeley to go out and build entire operating systems and computers, invent languages and protocols and processors, and create new businesses – and we fought for each and every dollar and technical advantage along the way. It was a blood sport, and we enjoyed it.

Several years ago I was talking to a student at the Vintage Computer Faire about the Symmetric 375 and Berkeley Unix. I had put together a board illustrating the birth of a venture-backed computer systems startup for those too young to know – photos of the empty offices, prototype wirewrap boards, checks to AT&T for Unix licenses and a tape of System V which we never used because we used Berkeley Unix, biz plans, reviews, articles, investment prospectus and materials, technical drawings, product materials. As I went through the life cycle of the investment, the systems built and the market created, he was fascinated in a “Gee, this is King Tut’s tomb” way. When I finished, he started to go into the usual GenX I-don’t-care mode, saying “Well, it wasn’t a Golden Age, but…”. Then he stopped, thought a moment, and corrected himself – “Actually, it *was* a Golden Age, wasn’t it?”. In a “new age” of marketing gimmicks and established players where innovation is considered bad form, I could understand his confusion. He’d missed out on all the fun.

So raise a glass to the Golden Age of Systems and the Demise of Sun. But do not mourn overly much – there will be other Golden Ages – but this one has most assuredly passed.

Myths and the Need for Innovation

It all started when one person asked a very simple question: why can’t we reduce packet drops throughout the network during congestion events (thus reducing the impact of RTT) with a more intelligent network, one able to refer back to caches of that information at a prior hop and resend, so that a drop in the fabric is transparently repaired? This all seems pretty simple, and yes, I’ve proposed such a mechanism myself. It is doable. Why not try it out? The usual “old” answer is that we don’t need to do anything. After all, everything we need to know about the Internet is already known, and this isn’t a problem. But is this true? Nobody knows for sure, but it’s a good way to stifle questions, isn’t it?
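The hop-level cache-and-resend idea is straightforward to sketch. The toy simulation below is purely illustrative – the class names, cache size, drop rates and repair limit are all assumptions, not a real protocol – but it shows the essential mechanic: each hop remembers what it forwarded, and a drop on its link is repaired locally instead of forcing an end-to-end retransmit:

```python
import random
from collections import OrderedDict

# Toy simulation of hop-local loss repair: each hop caches forwarded
# packets and replays them after a link drop, sparing the sender an
# end-to-end retransmit. Illustrative only, not a real protocol.

class Hop:
    def __init__(self, drop_rate, cache_size=64, rng=None):
        self.drop_rate = drop_rate
        self.cache_size = cache_size
        self.cache = OrderedDict()          # seq -> payload
        self.rng = rng or random.Random()

    def forward(self, seq, payload):
        """Cache the packet, then attempt the (lossy) link crossing."""
        self.cache[seq] = payload
        if len(self.cache) > self.cache_size:
            self.cache.popitem(last=False)  # evict the oldest entry
        return self.rng.random() >= self.drop_rate

    def resend(self, seq):
        """Local repair: replay a cached packet after a drop."""
        return self.cache.get(seq)

def send(path, seq, payload, max_repairs=3):
    """Push one packet through a chain of hops with hop-local repair."""
    for hop in path:
        for _ in range(max_repairs + 1):
            if hop.forward(seq, payload):
                break                       # crossed this hop's link
            payload = hop.resend(seq)       # repair from the hop cache
            if payload is None:
                return False                # evicted; sender must retransmit
        else:
            return False                    # repairs exhausted at this hop
    return True

rng = random.Random(42)
path = [Hop(drop_rate=0.1, rng=rng) for _ in range(5)]
delivered = sum(send(path, i, b"data") for i in range(1000))
print(f"delivered {delivered}/1000 with hop-local repair")
```

With a 10% per-link drop rate and three local repairs, the per-hop failure probability falls to roughly 0.1⁴, so nearly every packet arrives without the sender ever noticing a loss – which is exactly the “transparently repaired fabric” the question asks about.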

This is not a trivial issue. I see these technology debates springing up all over the research and development landscape, from operating systems to networking to applications. And I see the same answer tendered: shut up, we’ve already solved the problem, and if we stomp out the questioners, the problem won’t exist. This isn’t really a debate between the “old” versus “new” (some “old” designers are among the most innovative and creative people I’ve ever met), but more fundamentally, centers around the ability to question fundamental assumptions in an intellectually open and honest manner. In other words, the battle centers on the purveyors of myth versus the questioners of myth. And reputations are made or broken on the results.

In Matt Miller’s The Tyranny of Dead Ideas: Letting Go of the Old Ways of Thinking to Unleash a New Prosperity, Miller posits that Americans have become so unthinkingly accepting of their myths that they do not question them even when they defy their own experience. Miller views technology as one of the drivers out of this malaise. Unfortunately, the tendency to cleave to myth is not just the province of bankers, politicians and voters. And the consequences of abandoning reasonable discourse and proactive work can be unanticipated disasters.

Scientists too are prone to this all-too-human tendency to discount uncomfortable data in favor of desired results, even if those results are based on faulty or incomplete data. And woe to those scientists who cater to the desperation of others in an attempt to aggrandize themselves. Witness the recent Office of Special Masters of the U.S. Court of Federal Claims (aka “the vaccine court”) ruling that MMR and thimerosal do not induce autism – the initial data presented by Wakefield claiming autism and MMR shots were linked has been definitively demonstrated to have been fabricated for financial gain, yet there were other published studies by other scientists that claimed the same results. Only after very large, serious studies was this claim disproved, but in the meantime children who did not receive the vaccines – because these claims received scientific validation early on – have suffered or died from these very preventable diseases because of a bogeyman of autism (which, to say the least, doesn’t kill the patient). People were desperate for a cause, and instead of saying “We don’t know”, some scientists told them exactly what they wished.

Homeostasis in ideas cripples independent action. People hold off and put down ideas which could be carefully tested and developed in a considered manner because they are threatened by their potential “success” and fear the dimming of reputations and connections. Only when things completely break do people reach for other ideas, and by then (witness the current financial crisis) it is really very, very difficult to repair matters with a reasonable assurance of success. The events are driven by fear and need. During these times of crisis, people are prone to extreme or under-justified ideas – so long as they are simplistic and appear to “solve the problem”. Got a problem with autism? Don’t get vaccinated. Who needs vaccines anyway? Got a problem with banks? Bail them out. Nationalize them. Eliminate them. Go right. Go left. Shoot the messenger. The nuances of medical studies or derivatives and financial instruments are not interesting to people who are fearful and angry. If you think you’ve been living in dangerous times, Miller points out you haven’t even begun to experience how crazy it can get when people lose their mythic lifelines.

So what does this have to do with the Internet? The Internet is increasingly the *only* source of information for millions of people. Where people once read print magazines and newspapers, went to the library for books, joined clubs and organizations and kept up with letters over the course of years, now many read / view / communicate only via a browser abstraction. A collapse in the Internet due to years of denial and neglect about the nuances of its structure would be a catastrophe to hundreds of millions of people.

As such, it is important to ask how we can improve the Internet *now* without resorting to old myths and relationships that make us feel comfortable. Because the day will soon come when our old assumptions blind us to new issues, and we will allow this grand experiment to fail. And if that day comes, it will not be the reputable or reasoned scientists whose voices will be heard. It will be the ones who tell people what they want to hear. Is defending a myth worth this price?