Sedate Sunday: Silicon Valley and Post-Cold War Innovation

I came across this essay on Silicon Valley’s ascendancy. It’s a bit wordy in some places and only abstractly relates to Silicon Valley. But who can resist an article that merges IPR, Gramsci, Silicon Valley investment, and Bretton Woods?

I was amused, no matter how romanticized some of the assumptions. Come on, we all know that communism was really just another form of kleptocracy in disguise, just like Prosperity Gospel, unbridled capitalism, and all the other scams. It’s the human condition writ large.

Scams work by promising people things they neither merit nor deserve in return for becoming their trolls, fan-boys, minions, and various minor demons. At least Maxwell’s demons did some undeniably important work, but most of these lesser types from the Stygian Depths reject pile don’t want to work (hence the “merit” stuff I mentioned), nor are they part of the in-group (hence the “deserve” part). They’re also none-too-bright as a rule. But they are useful in aiding the ascent to substantial power and wealth, primarily by flooding the airwaves and empty streets with bellowing monsters, which in turn is covered by a lazy press corps as a meaningful “event” to be taken seriously by “those in charge”.

Technology has certainly brought down the costs of this well-established mechanism. You don’t have to print pamphlets to get attention. You can even more cheaply motivate the mob using facebook ads targeted to any feeble-minded demographic, or pull off in-your-face twitter placement with a word from the Big Twit himself. 

Honestly, it makes me long for the good old days of boardroom shenanigans when William and I pitched hard tech companies. And yes, they were just as misogynistic, narrow-minded, and assholish then as now. That hasn’t changed.

It’s just that back then there were still rivals, rules, and relationships to manage on the SV investment side. So William and I had a fighting chance. And fight we did. Sometimes…sometimes we made a success — before anyone caught on. Those were amazing times.

Now writers view startups as some kind of historical media retcon — a rather odd combination of Highlander, Fawlty Towers, and The Big Bang Theory (no women allowed, folks, unlike real life). William, who handled acquisitions for Tandem at one point, also had a fondness for Barbarians at the Gate, but that’s East Coast, not West Coast. And despite what folks will tell you, all those hagiographic movies about SV are so ridiculous and boring that I just don’t bother.

But historical fiction about SV will continue to be popular, especially with a polisci or econ twist. So go ahead and imbibe this one, especially the amusing views of open source development and startups:

“Within even the very early culture of Silicon Valley, a distinctive tension could be discerned between the “hacker ethic”—with its commitment to entirely free and open information, born as it was in a university laboratory—and the entrepreneurial drive to protect intellectual property. This was not a superficial short-term contradiction, but a defining productive tension that continues to animate the entire domain of networked and computer-driven social and economic relationships.”

Gilbert and Williams, How Silicon Valley Conquered the Post-Cold War Consensus

On to one of my personal pet peeves — there was no hacker ethic as described by the authors back when we were putting together various technologies for the Internet and Berkeley Unix prior to the early 2000s. The very concept of a hacker having any ethics is so laughable I wonder that any reputable journalist can type the words without gagging. We were in it for the fun, the money, and kicking over apple carts. Anything else someone tells you is a sales pitch.

Not to say there weren’t hackers back then. Of course there were. John Draper, aka Captain Crunch, was one such example. Back in the 1970s and 1980s, one could still get access to all the telecommunications and tech docs in public libraries and, with a bit of cleverness and elbow grease, hack pay phones, computers, and all sorts of primitive networks. Security was an afterthought in those days. Security is still an afterthought now. However, it wasn’t all fun and games. John was always followed around by men in suits and shiny black shoes at conferences, William noted.

Even 386BSD, which through its Dr. Dobbs Journal articles and releases birthed the open source operating system (even Linux used the articles’ 386 source code supplied with every issue), was based on a very different viewpoint from the present-day assumption that everything is “free”. Berkeley Unix had been licensed for over a decade, yet the vast majority of the works that made it up were not proprietary. It was inevitable that the remaining proprietary code remnants would eventually be removed and replaced.

Yes, the copyleft and RMS were talked about a lot back then, with the long-awaited HURD expected to roll over everything in the universe, and then Marxism would prevail! Gosh, I can barely type that while laughing. And yes, they really did believe they were some kind of Second Coming of the Open Source Proletariat before Bernie Sanders came along and stole their thunder.

This invested belief in the copyleft actually allowed Berkeley and us to work quietly. Frankly, no one expected Berkeley to finally get around to removing most of the old version 6 Unix detritus.

Even William’s and my prior company, Symmetric Computer Systems, contributed code for disk drive management. And William and I contributed the source code for the 386 port, making Berkeley Unix actually usable.

During this time, I really enjoyed writing the Source Code Secrets: Virtual Memory book with William, based on the virtual memory system from CMU. The CMU Mach project provided the key to a new approach to virtual memory, permitting us to jettison the old virtual memory system of a decade prior. It’s a nice piece of work that is much underappreciated.

And of course, when the unencumbered incomplete release was made public, we got creative and wrote entirely new modules to fill in the missing pieces for the releases.

But working on open source and working on proprietary intellectual property are not antagonistic, as the authors would have it. One of my proudest moments was getting my patents granted for InterProphet’s low-latency protocol processing mechanism and term memory.

The key is understanding what you owe to others and what you owe to yourself.

Berkeley Unix was a long-term project that collected the works of many people. Berkeley handled the release mechanics and integration. Sometimes they did new work, but not always. It was research, mostly paid for by the government. And that means you and me. 

William and I did the port to the 386, contributed code, wrote published articles, and devised new work as a research project. While we received no funding from Berkeley, we did have a lot of fun.

InterProphet, in contrast, was a 1997 startup focused on improvements in latency in networking using a dataflow architecture. Our innovations were funded, we had employees and an office, and we built the prototype and production boards. We developed the drivers and support software. We paid for really expensive proprietary chip design tools.

And we filed patents and held trade secrets. Intellectual property protection was a given in this work. (A bit of advice here: If your engineers decide to deal with bugs in their software by sending source code to the vendor, put a stop to it immediately. It causes no end of problems later.)

We had an obligation to the investors at InterProphet. And we kept our deals with that company. Just as William and I did with Symmetric Computer Systems back in the 1980s. Technology innovation was valued — at least enough so we could get another startup off the ground. It required due diligence and careful maintenance.

The mistake in many “historical” analyses of Silicon Valley innovation lies in conflating the technology innovation of the pre-2000 era with the non-innovative “free stuff” of the post-2000 period. Investment strategies were completely different. Business structures were different. Even financial structures pre- and post-IPO changed markedly. They’re not comparable.

There is nothing “free” in using FaceBook, or Twitter, or Google News, or Apple Maps, or a plethora of other websites. And that is by design.

These websites and applications are intended to go “viral”. They must lure in an unsophisticated customer and make the site “sticky” so they can be tracked. Gosh darn, that’s all it was and is about. No innovation required. In fact, invention and innovation were derided. As John Doerr noted back then, it was “renovation, not innovation” that was king. 

And as the authors note, anything related to manufacturing was sent off to China. No more chip investments. No more hardware investments. No more of that “risky” tech innovation. It had all been done.

I don’t usually call out specific VCs from that time, but John Doerr and Kleiner deserve it for singlehandedly killing an entire generation of technology with a cynical investment strategy. Special mention goes to Google, Apple, and Intel for corralling open source operating system innovation to maintain their profits.

So John and KPCB, and the tech monopolies as runners-up — I salute you.

People went hunting for content to populate those websites. YouTube, for example, grabbed the few popular short videos circulating on the web and put them on the site just to make it appear the site was being used — until, through relentless press, it actually was.

Customer acquisition dollars were high. A flip was six months.

Content was available in many ways. As the printed press conglomerates strove to grab eyeballs, they inadvertently gave their content away while cratering their traditional print advertising dollars. Aggregators glommed onto that content, manipulating the views towards paid ads and “curated” experiences. Video and music content was pirated as well, but entertainment media executives had been down this road many times before, and hit hard with copyright lawsuits. 

Databases of many kinds were publicly available as well, from geolocal map data to astronomy datasets. With that richness of information, the sky was the limit for people putting a front-end on the information. And so it is today.

I remember when Amazon was first funded as a bookstore. I bought a book — a Harry Harrison Stainless Steel Rat book, as I recall. One of the VCs back then gave me the dark side sell at an investment event: it was all about knowing what you look at, what you want, what you need. And putting that in front of you so you buy it. And Amazon takes a cut all the way to the bank. Privacy? Who cares.

It took Amazon six years to turn a quarterly profit.

Think about that. Six years losing money. When a VC starts demanding quarterly profits, dig up Amazon’s pro formas.

Fun Friday: The Race for AI Creative Works Control

In April of 2020, William and I wrote in the Cutter Business Journal an article entitled Moving Forward in 2020: Technology Investment in ML, AI, and Big Data. We focused on three areas: surveillance (monetization), entertainment (stickiness), and whitespace opportunities (climate, energy, transportation). This statement bears emphasis:

Instead of moving from technology to key customers with an abstracted total addressable market (TAM), we must instead quantify artificial intelligence (AI) and machine learning (ML) benefits where they specifically fit within business strategies across segment industries. By using axiomatic impacts, the fuzziness of how to incorporate AI, ML, and big data into an industry can be used as a check on traditional investment assumptions.

[For additional information on this article, please see AI, ML, and Big Data: Functional Groups That Catch the Investor’s Eye (6 May 2020, Cutter Business Technology Advisor).]

But one might be puzzled as to where generative AI tools such as ChatGPT or DALL-E fit in the AI landscape, and why we should care about AI art, AI news and press releases, AI homework and essays — even the threat of AI music, like the machine-made songs Orwell described in 1984.

The reality is these tools rely on easily crawled content lying around everywhere in the Internet attic. It also takes tremendous computing power to conduct ML and process this data into something with the appearance of sensible output. Hence, these tools will remain in the corporate hands of their creators no matter what they claim about “open source” — it’s simply too difficult for anyone but a giant corporate entity to support the huge costs involved. So this is about monetization and stickiness. Large companies are willing and able to pay the cloud costs if the customer becomes dependent on using their tools. Flashback to the tool-centric sell of the 1980s, Silicon Valley style. All we need is an AI version of Dr. Dobbs Journal and we’re all set.

Previous attempts at generative AI usually focused on small ML datasets, leading to laughable and biased results. Now companies are watching the shift from ads to AI at Google in particular, along with Microsoft and FaceBook. Everyone believes they are in a race and is frantically trying to catch up before all that sweet, sweet money is locked up by one of them.

But is there really any need to “catch up”? Is this a real trend, or just an illusion? Google made its fortune on categorizing every web page on the Internet. It had plenty of rivals back then. I was fond of Altavista myself. But there was also everything from Ask Jeeves to Yahoo. 

Now Google and Microsoft are analyzing the contents of these big pots of data with ML. But it’s not just for analyzing. It’s for creating content. Music. Art. News. Opinion. And you need an awful lot of processing power to handle all that data. So it’s now a Big Guy Game.

One of the approaches to eliminating bias is to use ML to process more and more data. The bigger the data pot, the less the bias and error. Well, that’s the assumption, anyway. But it’s a dubious assumption, given the pots analyzed are often variations on the same theme. Most search categorization is based on recent pages and not deep page analysis. Google is no Linkpendium.

All this, oddly, reminds me a bit of the UK mad cow fiasco, where the agricultural industry essentially cultivated prions by feeding dead animals to living ones. Much as Curie concentrated radium from pitchblende, the animals that died of the disease were processed and fed back to the herd. And since prions, like radium, persist through processing, they were concentrated and made their way up the food chain into humans.

So in like kind, the tools’ own output is feeding back into the ML feedlot and being consumed again. It may take longer than a few days, but we will be back to the same problems of bias and error.
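Here is a toy sketch of that feedlot loop, written in C with entirely invented numbers (sixteen content “categories”, a Zipf-ish starting corpus, a made-up survival threshold). It is not any real training pipeline, just an illustration of the drift: each generation the “model” samples the current corpus, drops what it rarely saw, mildly overweights what it saw most, and its output becomes the next corpus. The tail vanishes and never comes back, and the entropy only shrinks.

/*
 * Toy sketch of the feedback loop above: invented numbers, not any real
 * training pipeline.  The "corpus" is a distribution over 16 content
 * categories, Zipf-ish so there is a long tail of rarer material.  Each
 * generation, a "model" samples the current corpus, keeps only what it
 * saw often enough, mildly overweights what it saw most, and its output
 * becomes the next generation's corpus.
 */
#include <stdio.h>
#include <stdlib.h>
#include <math.h>

#define CATS        16      /* content categories                */
#define SAMPLES     4000    /* "training" samples per generation */
#define GENERATIONS 8

static double entropy_bits(const double *p, int n)
{
    double h = 0.0;
    for (int i = 0; i < n; i++)
        if (p[i] > 0.0)
            h -= p[i] * log2(p[i]);
    return h;
}

int main(void)
{
    double corpus[CATS], norm = 0.0;

    /* Start Zipf-like: a few common themes, a long tail of rare ones. */
    for (int c = 0; c < CATS; c++) {
        corpus[c] = 1.0 / (c + 1);
        norm += corpus[c];
    }
    for (int c = 0; c < CATS; c++)
        corpus[c] /= norm;

    srand(42);
    for (int g = 0; g <= GENERATIONS; g++) {
        int surviving = 0;
        for (int c = 0; c < CATS; c++)
            if (corpus[c] > 0.0)
                surviving++;
        printf("generation %d: %2d/%d categories, entropy %.2f bits\n",
               g, surviving, CATS, entropy_bits(corpus, CATS));

        /* "Train": sample the current corpus and count what was seen. */
        int counts[CATS] = {0};
        for (int s = 0; s < SAMPLES; s++) {
            double r = (double)rand() / RAND_MAX, acc = 0.0;
            for (int c = 0; c < CATS; c++) {
                acc += corpus[c];
                if (r <= acc) { counts[c]++; break; }
            }
        }

        /* "Generate": drop rarely seen categories, slightly overweight
         * the common ones, and feed the output back in as the corpus.  */
        double total = 0.0;
        for (int c = 0; c < CATS; c++) {
            corpus[c] = (counts[c] > SAMPLES / (2 * CATS))
                        ? pow((double)counts[c], 1.1) : 0.0;
            total += corpus[c];
        }
        if (total == 0.0)
            break;
        for (int c = 0; c < CATS; c++)
            corpus[c] /= total;
    }
    return 0;
}

Compile it with something like cc feedlot.c -lm (any file name will do) and watch the category count and the entropy column shrink generation after generation. Nothing in the loop can ever reintroduce the material it already dropped.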

However, the gimmick of having a “machine” write your essay or news blurb is very tempting. Heck, AIs are claimed to pass medical exams or law classes or handle software programming better than people.

But being a doctor or attorney or a software engineer is much more than book learning, as anyone who’s done the job will tell you.

And of course, there is now backlash from various groups who value their creative works and are not interested in rapidly generated pale imitations polluting their space and pushing them out. They didn’t consent to have their works pulled into a training set and used by anyone. Imitation is neither sincere nor flattering, and is even legally actionable if the work is protected by copyright, trade secret, or patent. It’s not “fair use” when you suck it all in and paraphrase it slightly differently.

This isn’t new. William and I ran into this in the old days with our 386BSD code. We were happy to let people use it and modify it — what is code if not usable and modifiable? But we asked that provenance be maintained in the copyright itself by leaving in the names of the authors. And we had entire modules of code that were written de novo, in the days when kernel design meant new ideas. It was an amazing creative time for us.

So I remember how shocked I was when an engineer at Oracle asked me about tfork() and threading, since a Linux book he had read talked about it, but he could find nothing in the Linux source code. I pulled out our 386BSD Kernel book and showed him that it was novel work done for 386BSD and would not work in Linux. Upon discussion, it turned out that the book had just “paraphrased” many of our chapters without even considering that Linux did not incorporate much of our work because it was a very different architectural design. It misled software designers — but I’m sure it sold a heck of a lot more books than we did by turning “386bsd” into “linux”. So it is today, only now it’s a heck of a lot easier for the talentless, the craven, and the criminal to steal.

Now many software designers are upset because their source code repositories are used as the models for automated coding, and they don’t like that one bit. And I don’t blame them.

We lived it. And it was a primary reason why the 386BSD project was terminated. Too many trolls and opportunists ready to take any new work and paraphrase it. So get ready to see this happen again in music, art, news, and yes, software. The age of mediocrity is upon us.

1984, here we come…

Fun Friday: Twitter and the Age of Anti-Innovation


“One lesson that has to be remembered in my line of business is that when an operation is over it is OVER. The temptation to stay just one more day or to cash just one more cheque can be almost overwhelming, ah, how well I know. I also know that it is also the best way to get better acquainted with the police. Turn your back and walk away – And live to graft another day.” The Stainless Steel Rat, Harry Harrison

Well, I wasn’t going to talk about Musk, but I’m a bit jealous. First he subpoenaed Stanford University about twitter’s 1995 origins — a university where he claims he spent all of two days in the materials science and engineering PhD program at that time. Then he up and forgot he was going for an interesting Silicon Valley history lesson and decided to buy the company anyway. Sigh.

Perhaps he gave up because he skipped out on paying Stanford their exorbitant tuition and fees by not enrolling, and he’s worried they still have the bill. Actually, this is very possible — my own father attended Stanford and left owing them $100. A generation later, when his son got admitted, Stanford still remembered. Academic debt is eternal. But the boring story is that Musk got a better deal and, frankly, I don’t remember twitter as an “item” at all. Go figure.

This was a heady five years for me and William: after writing the two-year 386BSD series “Porting Unix to the 386” in Dr. Dobbs Journal, releasing the source code of 386BSD 0.0, 0.1, and 1.0/2.0, and shipping the DDJ 386BSD Release 1.0 CDROM with all the writings and annotations in 1994, by 1995 we were putting the finishing touches on the first volume of Source Code Secrets while inventing role-based security and polymorphic protocols, exploring new approaches in high-speed networking (these articles actually led to a rethink in high-speed networking that birthed InterProphet in 1997), and tinkering with CDROM filesystems on a lark. So forgive me for missing the import of this crucial event.

Musk has an axe to grind. Actually, he has several axes to grind. Anyone who knows the history of SpaceX has seen his axe. I assume he was going to bury it right in Stanford’s backside by grabbing any info they have about Twitter and its hapless former CEO Parag Agrawal, but I suppose he’s now quite happy being Chief Twit (not my first choice for a moniker — I think Big Tweeter would be better) and chopping up anything that moves. My guess is he’s now looking for some confirmation of those darn bots popping up everywhere, like heffalumps and woozles. Are they real? Or just a fever dream? Who knows?

But 1995 does stand out in retrospect. It can officially be considered the year anti-innovation became the watchword in investment, even as amazing technologies like open source came to the fore. The opportunities for grift on the Internet (don’t forget that “no one knows you’re a whatever” meme) were so compelling and sexy that *any* attempt to disrupt this was taken as a threat.

Limit the words. Limit the thought. The nastier, the better. No discourse. No remorse. Virality uber alles. (Haven’t we learned by now that virality leads to pandemics?) 

Like crack, the unfiltered quips of just about anybody and their bot were addictive — especially to journalists. Gotta admit, it’s a lot harder to track down and interview people in depth, or attend press conferences, or sort through press releases, or travel to obscure places, or actually cross-check your sources first — especially if you’re not getting paid well for it. Twitter made all that stuff superfluous. What mattered was being the first. “Covfefe”, yeah baby! Deep stuff. Quit twitter? Forget it. They’re permanently addicted, and Musk knows it.

While twitter has an outsized influence on journalists who write about twitter, who else uses twitter, really? Politicians? Extremists? The Real Housewives of Salt Lake City? The most lucrative demographic from a marketing ad sales standpoint is young people, not these people. But most of the kids have migrated to other more trendy sites, like tiktok or instagram. Twitter usage declined 10% among teens over the last seven years according to Pew Research. Heck, even Facebook is doing better than them, and from my perspective it’s been getting grayer along with my cohort.

The problem with a cynical viral play is that things like “making money” or “building a product” are unimportant. We’ve seen that time and again, but twitter was the worst of the worst, lacking even a modicum of humor and humility. Even when they had a chance to build something sustainable for a younger target audience, anything that smacked of building a real business was stomped on. Virality and viciousness don’t require innovative talent and product.

One example of their anti-innovation attitude was their acquisition of Vine, a trivial and frankly unthreatening six-second video loop site. It was clear by the early 2000s that video was an interesting opportunity. Heck, I was pitching ExecProducer’s Massive Video Production strategy and online automated video production on Sand Hill Road in the mid-2000s. ExecProducer and CoolClip had much more sophisticated video server production than Vine, with a very different focus. So Vine should have been a no-brainer to move twitter into a younger demographic, right? Uh, nope. After four miserable years, it was shut down. In the end, twitter acquired a potential rival — and killed it.

I wish the anti-innovation euphoria popular in the Silicon Valley investment scene would wear thin. But it’s just too easy to make and lose money. Currently, venture capital investment is sitting on $500B of dry powder according to PitchBook. Think of those numbers, folks. $500 BILLION DOLLARS, just sitting in accounts, waiting for the next six-month-flip unicorn. It boggles the mind.

Real innovation is risky. It takes time. We can’t flip a startup in six months doing real code, real hardware, real systems. It takes time to convince customers to try our stuff. It takes time to shake out the bugs. But it’s also a heck of a lot of fun and necessary.

Because sometimes the grift really does end. And you don’t want to be there when it does.

Fun Friday: Telescopes and Memories

I’ve been planning to write something for a while, but frankly, there hasn’t been anything really fun to write about.

Everyone is complaining about gas prices and inflation. Global trade is still bottlenecked and tangled in knots. There’s still a pandemic, folks, although you wouldn’t know it from the way people are dancing like it’s the last night before the End of the World.

On the business front, venture is busily grabbing any money they can to hoard while telling their portfolio companies to “tighten the belt”, mainly around the necks of their employees. Companies are eagerly complying by rescinding job offers and instituting layoffs. Folks are nervous as they crowd airports, hoping their flight isn’t one of the hundreds cancelled that day due to lack of flight staff. And the war in Ukraine waged by Russia in a fit of insanity continues to kill innocents and destabilize the entire EU.

Speaking of dead innocents, the US Supreme Court, destined to go down in history as depraved pandering sacks of shit, decided that guns everywhere make for a stronger America. Their overturning of Roe v Wade, expected ever since the leaked draft approvingly citing the men who burned innocent people as witches crawled out of the sewers, has now been released, and to no one’s surprise it reduces women to the status of beasts. Yes, it is not a Fun Friday for many people. Maybe it’s a Gun Friday. I’m sorry.

Roe v Wade was decided in 1973. I was twelve. It impacted my life and health for the better. Today it is officially overturned in a ruthless precedent-be-damned legal coup. I am sixty, past childbearing age. It cannot impact me directly. Yet I have daughters and young people I care about. I don’t want to see them hurt. Their happiness and livelihood and health matters to me. They should have the same rights to choice and freedom that I had. They may not know how much it matters yet. But they will. I am sure of that.

I spent the morning cleaning one of William’s prototype telescope designs for display in the office. It’s an unusually compact and minimalist design. As I cleaned the mirror and cover plate, I found a cricket living in the focuser. I watched it hop off the picnic table and out of sight, grabbed the telescope, and took it to the office.

It now sits amongst the many creative works William and I did together. Our reliquary. Symmetric 375 computers. InterProphet low-latency networking boards. 386BSD articles and books and CDROM. An unpopulated six-layer 375 motherboard.

In other parts of the office: an EtherSAN prototype unit box, a 386BSD CDROM with the heftiest liner notes ever made, 386 computers of various vintages used for 386BSD, and bins of 386BSD and 375 disk drives, boards, and cables. Some complete and some mid-project, designs waiting for a hand to finish the work.

It is a reminder that things are never finished — they are only left in a state of usability for a time. Once that time passes, one either has to toss it away or begin again. I choose both. To toss some things away and to begin again on other things.

Young people also have a choice. They can fight for their freedoms, or they can toss them away. I hope they choose wisely.

Fun Friday: AI Technology Investments, Failed Startups, 386BSD and the Open Source Lifestyle and Other Oddities of 2020

First, William Jolitz and I wrote a comprehensive article entitled Moving Forward in 2020: Technology Investment in ML, AI, and Big Data for Cutter Business Journal (April 2020 – paid subscription). Given the pandemic and upheaval in global economies, this advice is even more pertinent today.

Instead of moving from technology to key customers with an abstracted total addressable market (TAM), we must instead quantify artificial intelligence (AI) and machine learning (ML) benefits where they specifically fit within business strategies across segment industries. By using axiomatic impacts, the fuzziness of how to incorporate AI, ML, and big data into an industry can be used as a check on traditional investment assumptions.

For additional information on this article, please see AI, ML, and Big Data: Functional Groups That Catch the Investor’s Eye (6 May 2020, Cutter Business Technology Advisor).

Techcrunch presented their loser brigade list of 2020 failed startups in December of 2020 – although a few more might have missed the list by days. Some of these investments were victims of “the right startup in the wrong time”. Others were “the wrong startup in the right time”. And some startups were just plain “the wrong startup – period”. 

We mourn the $2.45 billion which vanished into the eager pockets of dreamers and fools (we’re looking at you, Quibi – the pig that swallowed $1.75B of investment and couldn’t get any customers) and feel deeply for the Limiteds who lost money in one of the biggest uptick years in the stock market.

Thirty years have passed since we launched open source operating systems with 386BSD. Open source as a concept has been around for over 40 years, as demonstrated by the amazing GNU GCC compiler done by RMS. But until the mid-1990’s, most software was still held under proprietary license – especially the operating system itself. The release of 386BSD spurred the creation of other open source OS progeny and a plethora of open source tools, applications and languages that are standard today. However, the “business” of open source is still much misunderstood, as Wired notes in “The Few, the Tired, the Open Source Coders”. Some of the more precious gems excerpted:

But open source success, Thornton quickly found, has a dark side. He felt inundated. Countless people wrote him and Otto every week with bug reports, demands for new features, questions, praise. Thornton would finish his day job and then spend four or five hours every night frantically working on Bootstrap—managing queries, writing new code. “I couldn’t grab dinner with someone after work,” he says, because he felt like he’d be letting users down: I shouldn’t be out enjoying myself. I should be working on Bootstrap!

“The feeling that I had was guilt,” he says. He kept at it, and nine years later he and Otto are still heading up Bootstrap, along with a small group of core contributors. But the stress has been bad enough that he often thought of bailing. …

…Why didn’t the barn-raising model pan out? As Eghbal notes, it’s partly that the random folks who pitch in make only very small contributions, like fixing a bug. Making and remaking code requires a lot of high-level synthesis—which, as it turns out, is hard to break into little pieces. It lives best in the heads of a small number of people.

Yet those poor top-level coders still need to respond to the smaller contributions (to say nothing of requests for help or reams of abuse). Their burdens, Eghbal realized, felt like those of YouTubers or Instagram influencers who feel overwhelmed by their ardent fan bases—but without the huge, ad-based remuneration.

Been there. Done that.

Not many Linux-come-latelies know this, but Linux was actually the second open-source Unix-based operating system for personal computers to be distributed over the Internet. The first was 386BSD, which was put together by an extraordinary couple named Bill and Lynne Jolitz. In a 1993 interview with Meta magazine, Linus Torvalds himself name-checked their O.S. “If 386BSD had been available when I started on Linux,” he said, “Linux would probably never have happened.”

Linus was able to benefit from our two-year article series in Dr. Dobbs Journal (the premier coding magazine of the day, now defunct in an age of GitHub), in which, along with the how-to details of “Porting Unix to the 386”, we also included source code with each article. That, coupled with the Lions Commentary on Unix (NB – the old encumbered Edition 6 version, and not Berkeley Unix), allowed Linus to cobble together Linux. We had no such issues, as we had access to both Berkeley Unix and a source code license from AT&T for our prior company, Symmetric Computer Systems, and hence knew what was encumbered and what was not (Lions was entirely proprietary). Putting together an OS is a group effort to the max. Making an open source OS requires fortitude and knowledge above and beyond that.

Jalopnik, one of my favorite sites, found the ultimate in absurd Figure 1 patents with this little gem of an article: Toyota’s Robocars Will Wash Themselves Because We Can’t Be Trusted. Wow, they really knocked themselves out doing their Figure 1, didn’t they? Womp Womp.

And finally, for a serious and detailed discussion of how the pandemic impacted the medical diagnostic side, I recommend this from UCSF: We Thought it was just a Respiratory Virus. We were Wrong (Summer 2020). Looking back, it was just the beginning of wisdom.

Stay safe, everyone!

Intel Ouroboros: Pat Gelsinger Returns to Build the Future

Classical Ouroboros. Wikipedia.

Pat Gelsinger is a technologist’s technologist. He worked on the 386 and 486 processors. We referenced the book he and Crawford wrote, Programming the 80386, for our “Porting Unix to the 386” series in Dr. Dobbs Journal in the early 1990’s and the development of 386BSD. It was a seminal processor and work that helped launch the open source operating system movement.

Yet Pat didn’t stay to retire with laurels at Intel. After many years battling for Intel’s future, he left to head EMC and, later, VMWare. Now he’s been brought back to Intel as CEO effective 15 February 2021. Why?

In a nutshell, while Gelsinger was off dabbling in storage technologies and cloud services, Intel was burning through every single technology advantage people like Gelsinger had built. Now, Intel is facing a reckoning, and needs to build a future again.

And that future depends on people with technical and domain skill, like Pat Gelsinger. 

This was a bitter pill for Intel’s Board of Directors and executive team to swallow. But, as Baron Mordo said, “The bill comes due.”

The roots of this squandering of the future lay not in technology, but in contempt for technologists. Risk-takers in both strategic and startup investment in the 1990’s and 2000’s saw the proliferation of new approaches as “chaotic”.

InterProphet SiliconTCP board. 1998.

I sat in an office of a top tier VC firm on Sand Hill Road in the late 1990’s and listened to the “smart money” partner complain about how their investments in ATM were being disrupted by InterProphet’s SiliconTCP low latency chip — as if I owned the burgeoning TCP/IP technology and was personally damaging their investments with a few prototype boards, a handful of working FPGAs and some Verilog.

TCP/IP was present in the mid-1980’s in Berkeley Unix, and used in datacenters throughout academia and government. As Vint Cerf himself noted, it was a good enough solution to get packets from one point to another.

TCP/IP as an “ad hoc” technology was good enough to take out OSI, ISDN and ATM. I thought it was wiser to surf the tsunami instead of railing against it. That just bred resentment.

I sat in corporate offices in the 1990’s and 2000’s and heard complaints about how open source was overtaking proprietary software stacks, and it was ruining their projections and their business.

Berkeley Unix was a feeder of innovation from the early 1980’s. True, it was not a viable competitor to proprietary OS stacks until we launched 386BSD in the early to mid-1990s. From that open source stack, backed by Dr. Dobbs Journal, sprang a whole host of competitors to the proprietary software industry, including Linux.

Open source kernels like 386BSD and its many progeny would not have made inroads if there had not been a wealth of innovation already present for these groups to mine — innovation that was neglected, minimized or attacked by established proprietary players.

But up to the point we released 386BSD publicly, everyone underestimated us. It couldn’t be done. It wouldn’t be done. But it was done. I knew it could be done.

I sat in a room in the early 2000’s as a VP at Microsoft complained about how open source was a threat and how, looking right at me, they had gathered information on everyone involved and their families. As if developing the open source OS created some kind of ominous fifth column of open source software subverting their eternal rights to OS glory. It was…unpleasant. It was also incredibly horribly damaging personally.

I listened in the mid-2000’s as a VC “sympathetically” told us that we’d never get funding again after InterProphet. Not because we’d done anything wrong. We met our commitments. We built things. But because they didn’t want innovation and the risks that came with it. And their way to kill the message was to kill the messenger.

“The bill comes due”.

The resentment in the 1990s and 2000s towards new ideas and the creation of new products was intense. All they could see was damage to their five year technology plans and prior investments. The idea of hedging your bets was anathema, because that implied they couldn’t control the industry.

And mind you, it was about control. Control of technology. Control of innovation. Control of monetization. Control of creativity. Control of thought.

So here we are, in 2021. Intel squandered their future, slicing and dicing their monetization game. Intel’s “safe and sane” business relationship with Apple is now in pieces. In 2018 Apple maneuvered Intel into taking out Qualcomm as a competitor. In 2019 Apple acquired Intel’s smartphone modem tech and developed their own. In 2020 Apple introduced the M1 as a competitor to the high end X86 line. And that’s just one customer. The vultures are circling. Intel lost control.

Now Pat Gelsinger has agreed to come back. How will he pick up the pieces of a damaged company? I assume he’d only return if he had broad latitude in restructuring, hiring and firing. He’ll investigate interesting acquisition targets that offer a path forward for Intel. And he’ll look closely at how rivals like AMD under Dr. Lisa Su have done so well while Intel foundered.

Intel ouroboros. Pat is back at the beginning. It remains to be seen how he creates a future for Intel once again.

Yes Virginia, Neutrinos Do Have a Bounded Mass (Thanks to Big Data)

Getty Images.

Many years ago, I attended a talk Jim Gray gave at Stanford in which he outlined the challenges of processing the huge datasets accumulated in scientific fields like astronomy, cosmology and medicine.

In those days, the greatest concerns were: 1) cleaning the data sets and 2) transporting the data sets. The processing of these data sets, surprisingly, was of little concern. Data manipulation was processor-limited and modeling tools were few. Hence, success was dependent on the skill of the researchers to delve through the results for meaning.

Jim lived in a world of specialized expensive hardware platforms for stylized processing, painstaking manual cleaning of data, and elaborate databases to manipulate and store information. As such, large academic projects were beholden to the generosity of a few large corporations. This, to say the least, meant that any research project requiring large resources would likely languish.

In the decades since Jim first broached the huge data set problem (and twelve years after his passing), the open source disruption that started with operating systems (of which I was a part) and new languages spawned in turn the creation of data tools, processing technologies and methods that Jim, a corporate enterprise technologist, could not have imagined. Beginning with open source projects like Hadoop and Spark (originally from UC Berkeley, just like 386BSD), on demand databases and tools can provide (relatively speaking) economical and efficient capabilities. And one of the biggest of big data projects ever recently demonstrated that success.


The Security Frustrations of Apple’s “Personal” Personal Computer: Device Access, Two Factor ID, and 386BSD Role-Based Security

Image: smsglobal.com

Recently, a FaceBook friend lamented that he could not access his icloud mail from a device bound to his wife’s icloud access. He also expressed frustration with the security mechanism Apple uses to control access to devices – in particular, two-factor authentication. His annoyance was honest and palpable, but the path to redemption unclear.

Tech people are often blind to the blockers that non-technical people face because we’re used to getting around the problem. Some of these blockers are poorly architected solutions. Others are poorly communicated solutions. All in all, the security frustrations of Apple’s “personal” personal computer are compelling, real and significant. And do merit discussion.


Intel’s X86 Decades-Old Referential Integrity Processor Flaw Fix will be “like kicking a dead whale down the beach”

Image: Jolitz

Brian, Brian, Brian. Really, do you have to lie to cover your ass? Variations on this “exploit” have been known since Intel derived the X86 architecture from Honeywell and didn’t bother to do the elaborate MMU fix that Multics used to elide it.

We are talking decades, sir. Decades. And it was covered by Intel patents as a feature. We all knew about it. Intel was proud of it.

Image: Jolitz, Porting Unix to the 386, Dr. Dobbs Journal January 1991

Heck, we even saw this flaw manifest in 386BSD testing, so we wrote our own virtual-to-physical memory mapping mechanism in software and wrote about it in Dr. Dobbs Journal in 1991.
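For those who never lived inside an i386 kernel, “virtual-to-physical mapping in software” just means the kernel walks the processor’s two-level page tables itself instead of trusting whatever the hardware happens to have cached. Here is a minimal, self-contained C sketch of that classic i386 walk (10 bits of page-directory index, 10 bits of page-table index, 12 bits of offset). It is an illustration only, not the 386BSD code from the articles, and it simulates the tables with plain arrays so the example actually runs.

/*
 * Illustrative sketch only: NOT the 386BSD code referenced above.
 * A software walk of the classic i386 two-level page tables.  The
 * "physical memory" holding the tables is simulated with ordinary
 * arrays so this example is self-contained and runnable.
 */
#include <stdio.h>
#include <stdint.h>
#include <stdbool.h>

#define PTE_PRESENT  0x001u
#define FRAME_MASK   0xFFFFF000u
#define ENTRIES      1024

/* Simulated page directory and one page table (4-KB aligned in real life). */
static uint32_t page_directory[ENTRIES];
static uint32_t page_table[ENTRIES];

/* Walk the tables in software: virtual address -> physical address. */
static bool virt_to_phys(const uint32_t *pdir, uint32_t vaddr, uint32_t *paddr)
{
    uint32_t di  = (vaddr >> 22) & 0x3FF;  /* top 10 bits: directory index */
    uint32_t ti  = (vaddr >> 12) & 0x3FF;  /* next 10 bits: table index    */
    uint32_t off =  vaddr        & 0xFFF;  /* low 12 bits: page offset     */

    uint32_t pde = pdir[di];
    if (!(pde & PTE_PRESENT))
        return false;                      /* no page table mapped here    */

    /* In a kernel this would follow the physical frame address held in
     * the PDE; in this simulation the single page table stands in for it. */
    uint32_t pte = page_table[ti];
    if (!(pte & PTE_PRESENT))
        return false;                      /* page not present             */

    *paddr = (pte & FRAME_MASK) | off;     /* frame base + offset          */
    return true;
}

int main(void)
{
    uint32_t vaddr = 0x08049f30;           /* arbitrary example address    */
    uint32_t di = (vaddr >> 22) & 0x3FF;
    uint32_t ti = (vaddr >> 12) & 0x3FF;

    /* Map that one page: the directory entry "points at" our table, and
     * the table entry maps the page to physical frame 0x00123000.       */
    page_directory[di] = PTE_PRESENT;      /* frame field elided in the sim */
    page_table[ti]     = 0x00123000u | PTE_PRESENT;

    uint32_t paddr;
    if (virt_to_phys(page_directory, vaddr, &paddr))
        printf("virtual 0x%08x -> physical 0x%08x\n", vaddr, paddr);
    else
        printf("virtual 0x%08x is not mapped\n", vaddr);
    return 0;
}

The point is simply that the translation is an ordinary, auditable table lookup the kernel can perform and verify on its own terms, rather than something it must take on faith from the hardware.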

You could have dealt with this a long time ago. But it was a hard problem, and you probably thought “Why bother? Nobody’s gonna care about referential integrity“. And it didn’t matter – until now.


SpaceX and Open Source – The Costs of Achieving Escape Velocity

The successful low-earth orbit of a Dragon capsule mock-up by the Falcon 9 rocket was a great achievement by SpaceX last week (June 4, 2010) and a harbinger of the new age of private space transport. As I watched their success, the excitement from the press and space enthusiasts, and the unexpectedly vindictive response from many inside NASA, I was reminded of the launch of 386BSD – and why those most able to understand your achievement often are the most parochial.

Space exploration is a family tradition for the Jolitz clan, starting with William L. Jolitz, who developed transponders and thin and thick films for many spacecraft at Ford Aerospace (some still transmitting telemetry long after his passing), through his son William’s work at NASA-Ames on an oscillating secondary mirror for the Kuiper Airborne Observatory as a high school intern, to his three grandchildren working at NASA in astrobiology (Rebecca Jolitz), orbital dynamics and space fatigue simulations (Ben Jolitz), and spacecraft logistics for science projects (Sarah Jolitz).

Ben and Rebecca Jolitz also had the opportunity to meet Elon Musk, founder of SpaceX, at the 2007 Mars Society Conference held at UCLA and were inspired by his vision and determination (it was at that conference that Ben decided he wanted to major in physics at UCLA). Though still in high school, they had received honorable mention in the Toshiba ExploraVision competition for their highly creative concept of a Mars Colonization Vehicle based on using an asteroid in a controlled bielliptic orbit as a transport vehicle to provide the “heavy lifting” of supplies and personnel between Earth and Mars. They were invited to present a more detailed talk on their project for the Independent Study track. The speakers at this conference were a great spur to their scientific enthusiasm.

So it was with great satisfaction that I watched SpaceX demonstrate that “rocket science” is no longer the province of great nations but instead will bring about a democratization of space – cargo and transport, experimentation and eventually mining and exploration.

This is no quick path, however. The struggle for open source software, beginning with Richard Stallman and his remarkable GCC compiler, Andy Tanenbaum’s Minix system, Lions’ careful documentation of version 6 Unix, our Dr. Dobbs Journal article series on 386BSD Berkeley Unix and its subsequent releases, and Linus Torvalds’s amazing synthesis of the prior Unix, 386BSD and Minix works to achieve Linux, occurred during an enormous burst of creativity that actually totaled about five years (1989-1994). After this came the long process of usability design – driver support, GUI support, applications support, new scripting languages – which is still a work in progress after another decade. Big vision projects take a lot of time and are not for the timid.

It is no secret that NASA has been struggling for many years with a lack of purpose. Just like Unix in the mid-1980’s and Windows in the mid-1990’s, technology which is held too tightly by a single group or company or national agency tends to calcify. Innovation becomes too risky. Agendas and interest groups override design decisions that might theoretically impact their funding. It becomes easier to add to a design than subtract from it, resulting in an unwieldy project which never converges in form or function.

Eventually, more effort is put into maintaining the flaws than eliminating them. Bugs and unexpected interactions begin to dominate, resulting in more meetings, workshops and conferences. Tools to manage the side-effects and flaws of the project become the object of research, while the actual project suffocates as it becomes more and more obese.

The life cycle of an operating system, like the life cycle of a space exploration vehicle, encompasses a brief burst of risk-taking and innovation followed by a long series of “rational” decisions which add heft and gravitas, followed by bloat, loss of purpose and final collapse. But during the long period of bloat and dementia, the lack of satisfactory execution provides an opportunity for newer faster designs leveraging new technologies in other fields to pry into previously unobtainable market niches and slowly eat out the old markets. This happened with open source, and it is happening with space exploration.

The shuttle itself is over 35 years old and encompasses aging technology which can no longer be retrofit – and has been long scheduled for decommission. This schedule has been put off again and again for two reasons: 1) the US has refused to properly fund and schedule a replacement because the costs and commitment are very great and 2) the maintenance and rocket groups are based in key states dependent on continual funding. Politics as usual has been to fund existing projects when we are long overdue to redefine NASA’s mission and goals. And, as is often the case, in refusing to examine other options, we have been left with only one option – end the shuttle program and depend on other nations and consortiums for transport – Russia and the European Space Agency primarily.

The Bush-era Constellation program was in theory supposed to provide an alternative, but the results of this program were laughable – it became a symbol of a bloated self-referential insatiable rocket bureaucracy that couldn’t build a real rocket to get pizza, much less get to the moon. And there was the tragic side to this – so many Americans love to complain about their government by saying “If we could get a man to the moon, why can’t we do” whatever, when in reality America lost that ability over 25 years ago with the decommission of the Saturn 5 rockets. Since then, like many other government makework projects, “rocket science” has devolved to fantasy powerpoint presentations and one-off prototypes that might have been flown, except for the risk of failure.

So while the science side of NASA, with their unmanned probes and experiments and space telescopes, has continually advanced despite the occasional loss, the rocket side has cowered, fearful of failure yet addicted to the status quo of “no risk = no failures”. And this stance, while appearing to play it safe, has created more opportunities for the SpaceX’s of the world as space transport, satellite maintenance and other niche markets look for more effective and less expensive approaches.

Competition, we are always told, is good for America. After all, it was competition with the old Soviet Union that launched the space program – and the need to hire rocket scientists to get us up there. So in principle NASA’s rocket guys should be pleased with SpaceX – they can leverage SpaceX’s experience while encouraging their own demoralized workforce to become more innovative. Like open source, the knowledge that “it can be done” should provide both a relief to fear and a spur to greatness. It’s a win-win, right?

So why the malice and anger? Why did so many within the agency that could most benefit from this knowledge wish SpaceX ill? Why are they running down their achievement? Why aren’t they rising to the challenge? Aren’t they eager to break out of their repressive paradigms?

While envy and fear of change play a great role here, the loss of status is most pernicious. During the rise of open source, a new set of designers and developers began to set the pace for innovation. Many programmers frustrated in their work in industry found an outlet in open source. An avalanche of ideas – good, bad and indifferent – could no longer be repressed by groups controlling proprietary operating systems source. These groups – corporations, standards committees, technology “gurus” – derived much benefit from the old system. They were the leaders at conferences, the movers and shakers of agendas. More than even money, they had the power to elevate or destroy ideas and people on a whim. And believe me, I saw what happened when people didn’t “get with the program”. It wasn’t pretty.

When 386BSD was born, I was told by many in the hard-core Unix side that it would be “strangled in the cradle” – either by lawsuits (which of course, never happened) or by ridicule (which did occur, constantly). I didn’t believe it. I just couldn’t believe that the experts I knew in the biz would wish it ill when they had an opportunity to finally work with BSD without all the proprietary license rigamarole. For years I had heard people complain about all the agreements and licenses and restrictions and “If only it were unencumbered”. Now that they had their wish, wasn’t it great?

Boy, was I misled. What I saw as an opportunity, many other good talented people saw as a threat to their comfortable professional existence. I understand comfort, and I never wanted to make anyone unhappy. But in giving them what they had wished for, I did make them unhappy, because I also gave it to everybody else – and that was inconvenient. Well, I plead youthful enthusiasm in misunderstanding their desires. But if given the chance, I’d do it again, because it was the right thing to do – even if I did it the “inconvenient way”.

So what were the claims? I was told nobody would use open source because it didn’t have a big company behind it – and we see today that was wrong. I was told that nobody would make money off of open source – and today we see many companies developing profitable businesses off of support and new design. I was told that nobody would use open source to innovate, and yet I use entirely new applications and languages that were not even thought of at the time Dr. Dobbs Journal launched the “Porting Unix to the 386” series in January of 1991. I was told that the only way to distribute software was by selling it on a disk, and that we were crazy to put it out on the Internet, and yet now this is the way even proprietary software is distributed. When I talked about Internet-based OS’s, I was literally laughed at by experts I respected – and it hurt – but now we see the beginnings of the “webOS”.

The ridicule did have real and lasting effects: the constant intimations by Unix groups of pending lawsuits that never arrived but always “loomed”; the personal strain of creating entire OS releases on a shoe-string budget funded mostly by writing articles and refinancing our house while raising three young children; the ever-escalating expectations of a consumer audience demanding a commercial OS with all the bells and whistles, dissatisfied with traditional Berkeley Unix research releases and their traditional demands of self-administration (in a “damned if you do, damned if you don’t” moment I actually insisted the Dr. Dobbs OS release installation and administration be automated by default, with the traditional installation process selectable if desired, and was then ridiculed for not making people do it the hard way – sigh); and finally, the relentless badmouthing of any new approaches in the kernel – the raison d’être of Berkeley Unix but not, admittedly, of a commercial corporate proprietary system. The last of these was the hardest to bear, frankly, and I understood why many other designers, seeing this, fled to Linux. After all, the ridicule, badmouthing and blacklisting was a piece of what they had experienced in their companies, so why endure it in a supposedly “open source” project?

So, as with 386BSD, the NASA badmouths and their corporate masters could potentially destroy SpaceX. Yes, SpaceX is better funded than a two-person project like 386BSD – our original “Falcon 9” rocket was a 300-400 kbyte kernel plus some apps (386BSD Release 0.0) and 17 articles of 5,000-10,000 words each, plus code, on how to do it yourself – but getting out of the gravity well of Earth, not to mention the psychological gravity well of believing you can do it (which seems to be more like Jupiter’s in terms of magnitude), is a heck of a lot harder. Ridicule, the inevitable technical setbacks SpaceX potentially faces, liability laments (ah, there’s that “lawsuits pending” stuff again), a steep learning curve, American impatience (doing new releases with some new innovative work in the kernel took us about 8-12 months – doing the next stage in rocket / capsule design will take longer) and media disillusion when the audience fades (no audience = no money) all add to the burden.

But even if, somehow, SpaceX is marginalized, their accomplishments are *real* and will spur others to try. Linux was able to grow and thrive during this time precisely because it was *not* an American project – based in Finland, it escaped the canards thrown at 386BSD, which were deemed irrelevant to Linux. Linux was a safe haven for many serious programmers disillusioned with the threats, lies and distortions promulgated around Berkeley Unix precisely because it was an outsider, uninfluenced by other interests.

People of ill will can kill an innovative project for a while. But they can’t kill the idea on which that project is based. It may be delayed for a while. But somewhere, somehow, it will spur others on to try. SpaceX, like 386BSD, is only the beginning.