Raytheon Wins the 10 Million US Patent Sweepstakes!

Today the US Patent and Trademark Office issued its ten millionth patent! The extraordinary Patent No. US 10,000,000 entitled “Coherent Ladar Using Intra-Pixel Quadrature Detection” was assigned to Raytheon Company by inventor Joseph Marron of Manhattan Beach, California.

As the winner of this sweepstakes, Raytheon has been granted a lovely 20-year monopoly from the filing date (10 March 2015) for a new and unique invention that compares target and sample frequencies in a clocked processor to determine the phase difference for navigation. You can also, it notes, use it for holography, assuming your target and you are both stationary, but that’s unlikely to happen unless you’re driving a Chevy Malibu.

As a token of esteem, Raytheon has been provided this lovely new patent cover page to swaddle their new baby patent Figures and Claims. We have no doubt Raytheon’s patent counsel shall commence to commit it to the deep, to be turned into corruption, looking for the resurrection of the body when the sea shall give up her dead.

I congratulate Raytheon for winning the 10 Million US Patent Sweepstakes, beating out ever-industrious rivals IBM, Samsung, Canon, Qualcomm, Toshiba, Sony, LG, Intel, Microsoft and most particularly, Google, which has missed out on yet another self-driving navigation patent.

The next milestone sweepstakes, the 20 Million US Patent Sweepstakes, should be starting right about…now. Inventors: Start your engines.

2018 Tech IPOs: Is Enterprise a Game-Changer?

Jon Swartz’s recent piece in Barron’s asks “Is This the Year Tech IPOs Stage a Comeback?” Prior-year IPOs did not meet expectations, with consumer companies like Snap and Blue Apron the poster children for a miserable performance.

But the speculation among the smart money is that 2018 tech IPOs will surge, and they’ll be driven by enterprise companies.

So, is enterprise the game changer for tech IPOs in 2018?

My answer: “Yes” and “No”. Does that help?

Enterprise companies differ from consumer companies in that one can validate revenue streams. It’s a no-brainer, really. So that is comforting to investors looking for solid growth.

But the downside to enterprise companies is that the very revenue streams that make them look juicy are also very vulnerable to disruption from upstarts offering a better deal.

The reason the 2017 consumer tech IPO market was a miserable failure from an investment perspective is that everyone fixated on hyping a big name, Snap or Blue Apron. The more they repeated it, the more they assumed consumers would flock to use it, justifying the IPO.

That didn’t happen because, unlike Facebook, which did an extensive global lockup of the market, companies like Snap didn’t bother to do that hard footwork. But that footwork was necessary to continued success after the spinup to IPO. They got lazy.

Now we’re in 2018, and extreme views change extremely. Consumer plays are anathema. Enterprise is the ticket. Getting some traction for revenue in an enterprise company is possible and desirable.

But the problem for enterprise companies is understanding the quality, consistency, and dependability of revenue as new guys offer better deals.

In particular, I wonder about Dropbox post-IPO. Is their enterprise story really that deep? Honestly, I can think of several different ways one can do a Dropbox one better, and I’m not even trying hard. Imagine if some investors got serious about this area, instead of playing low dog on the deal flow. Would it be an easy one for newbies to crowd in and offer enterprise customers a better deal? We shall see.

The Security Frustrations of Apple’s “Personal” Personal Computer: Device Access, Two Factor ID, and 386BSD Role-Based Security

Recently, a Facebook friend lamented that he could not access his iCloud mail from a device bound to his wife’s iCloud account. He also expressed frustration with the security mechanism Apple uses to control access to devices – in particular, two-factor authentication. His annoyance was honest and palpable, but the path to redemption unclear.

Tech people are often blind to the blockers that non-technical people face because we’re used to getting around the problem. Some of these blockers are poorly architected solutions. Others are poorly communicated solutions. All in all, the security frustrations of Apple’s “personal” personal computer are compelling, real and significant. And they do merit discussion.

One way tech folks get around Apple restrictions on email, for example, is to use multiple accounts. On an iOS device one can use multiple services and set up accounts for these services. For example, most don’t know that Notes is actually a hidden email client that syncs via an IMAP fetch. Apple does use the appropriate protocols behind the curtain.
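As a sketch of what’s happening behind that curtain: fetching mail over IMAP takes only a few lines with Python’s standard imaplib. The host, account, and password below are placeholders, not real iCloud settings; substitute your provider’s IMAP host and an app-specific password.

```python
import imaplib

# Placeholder server and credentials -- substitute your own provider's
# IMAP host and an app-specific password. These values are illustrative.
HOST = "imap.example.com"
USER = "user@example.com"
PASSWORD = "app-specific-password"

def fetch_latest_subject(host, user, password):
    """Log in over IMAP, select the inbox read-only, and return the
    newest message's Subject header (or None if the inbox is empty)."""
    with imaplib.IMAP4_SSL(host) as conn:
        conn.login(user, password)
        conn.select("INBOX", readonly=True)
        _, data = conn.search(None, "ALL")
        ids = data[0].split()
        if not ids:
            return None
        _, msg = conn.fetch(ids[-1], "(BODY[HEADER.FIELDS (SUBJECT)])")
        return msg[0][1].decode().strip()
```

Any client that speaks this protocol – Notes included – is doing essentially this handshake underneath.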

In my case, I don’t even use iCloud mail. I use dedicated email accounts, with access mediated by a mail server in our datacenter cloud which we personally administer. So all this frustration doesn’t impact me. I don’t see it in my daily life. I’m blind to the impact on others.

But what if I, like most people using Apple devices, want to access iCloud mail from any Apple device? iCloud isn’t an ordinary heterogeneous service. It is actually bonded to a set of devices. This is the philosophy behind Apple’s “personal” personal computer. The assumption is that the customer has many devices, tightly held and only used by that single customer. If that’s so, it naturally follows that those devices will not easily permit access to a different iCloud mail account, because there is no need. The security won’t allow it. They expect you to set up another IMAP account, like Gmail, directly. They expect you to be a “techie”.

As an experiment in trying to understand this issue, I went to a Mac that is not bound to an iPhone I had in hand. Using “Find My iPhone”, I logged in with that Apple ID and password. It then showed the location of the iPhone sitting next to me. I then changed to the mail app within the browser, and it showed me the iCloud mail. All this on a Mac bonded to a different user. So I did get around the problem.

But it was non-intuitive. It was somewhat absurd. And it did reveal a security issue. Security by obscurity is a bug, not a feature.

I was unable to test this on other devices, such as an iPad, as I was pressed for time. But I’m pretty sure there are ways to get around this even on small devices. But really, seriously, is this sensible?

This entire conversation then segued into a discussion of two-factor authentication. In theory, two-factor authentication is quite straightforward: since everybody has a phone and some other device, if someone who cracked your password tries to access your account from a device that isn’t one of your bonded devices, an email or a text goes to another device known to be yours to confirm it’s OK. Simple, right?
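In code, the core of that flow really is tiny: mint a short-lived one-time code, deliver it out of band to the trusted device, and compare what the user types back. This is a minimal sketch of the idea, not Apple’s implementation; the six-digit format and five-minute window are my assumptions.

```python
import hmac
import secrets
import time

def issue_code(ttl_seconds=300):
    """Generate a six-digit one-time code and its expiry timestamp.

    In a real system the code would be sent to a trusted secondary
    device by SMS or push, never returned to the requesting device.
    """
    code = f"{secrets.randbelow(1_000_000):06d}"
    return code, time.time() + ttl_seconds

def verify_code(submitted, issued, expires_at):
    """Accept the login only if the code matches and hasn't expired."""
    if time.time() > expires_at:
        return False
    # compare_digest avoids leaking information through comparison timing.
    return hmac.compare_digest(submitted, issued)
```

The hard part, as the rest of this post argues, isn’t the code; it’s the human holding (or not holding) the second device.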

Well, theory and practice are called that because thinking and living are really two different things. People live messy lives. Their phone may have been lost or forgotten or not charged up. They don’t know what device is the “mother may I” device.

The fundamental problem is that the nature of the security constraints of the Apple iPhone concept requires it to be hermetically sealed. (In contrast, Android is a leaky sieve, and it is quite vulnerable.) This is why the battles between Apple and the government over access to personal iPhones are so fraught, because it really is all or nothing.

This is not the only use of two-factor authentication. Two-factor authentication is required for a lot of services, such as Facebook, and is not bound to a particular device walled garden. But the issue of making sure you have access to both the primary device and the “mother may I” device is still there. You must have all your devices, old and new, standing ready for the occasional incursion. And you must check and update all two-factor authentication access points when you update devices. And this, my friends, is absurd.

I actually did a little work on security way back in the 1990s, where William and I came up with the concept of role-based security as an adjunct to the usual password mechanisms. We even wrote up an article in Dr. Dobb’s Journal about it, plus implemented it in 386BSD Release 1.0.
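The core idea of role-based security can be sketched in a few lines: privileges attach to roles, users hold roles, and every operation is checked against a role rather than an individual. This is an illustrative reconstruction of the concept, not the 386BSD code; the role names and operations here are made up.

```python
# Roles map to the operations they permit; users map to roles.
# Role names and operations are illustrative, not the 386BSD role set.
ROLE_PERMISSIONS = {
    "operator": {"backup", "shutdown"},
    "admin": {"backup", "shutdown", "add_user", "set_password"},
    "auditor": {"read_logs"},
}

USER_ROLES = {
    "lynne": {"admin"},
    "william": {"admin", "auditor"},
    "guest": set(),
}

def is_permitted(user, operation):
    """Allow the operation only if one of the user's roles grants it."""
    return any(
        operation in ROLE_PERMISSIONS.get(role, set())
        for role in USER_ROLES.get(user, set())
    )
```

The appeal is administrative: revoking a role revokes a whole bundle of privileges at once, with no password churn.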

What were the gotchas? Security people didn’t like it because they were obsessed with crypto as a savior – which of course it wasn’t. The IT guys weren’t enamored because they liked administering passwords and didn’t think it was a problem to make people change their password all the time – even though people don’t do that, because they forget it, or they put it on a Post-it on their monitor or write it in their wallet, and use “password” or “12345” as their password.

Two-factor authentication is just doing a “mother may I” check against a separate device, workable because we now have lots more devices. But it fails when the device being asked isn’t available, blocking the user from work. It may not be great, but at this point, it’s really all we have.

Apple Store “Bait and Switch” iPhone Battery Gambit: Apple Giveth and Taketh Away

Beware the Apple Store “bait and switch” iPhone battery gambit. We faced this yesterday in Los Gatos, CA, where they tried to claim a working iPhone 6s with a good screen and original owner was not eligible for their $29 battery replacement at the appointment because it had a slight bow in the frame.

Now, by this point everyone likely has some flaw in their old iPhone, whether it’s a slightly dinged frame from being dropped or a minute crack or scratch under the frame. It’s normal wear and tear. And they likely didn’t have a problem replacing the battery before the discount was announced, when replacements were more costly and infrequent. But now, it’s an issue.

They did offer to sell an iPhone 6s for close to $300! This is a terrible price. Don’t go for it. This is what they mean by bait and switch.

There’s a good reason why Apple doesn’t want to replace old batteries after their bungled attempt to intentionally slow down older iPhones with an OS update was discovered, but they don’t mind selling old inventory at a premium. Money.

According to Barclays’ analyst Mark Moskowitz, extending the life of old iPhones will impact Apple’s bottom line and stock price severely: “In our base case scenario, 10% of those 519M users take the $29 offer, and around 30% of them decide not to buy a new iPhone this year. This means around 16M iPhone sales could be at risk, creating ~4% downside to our current revenue estimate for C2018.”

I suppose we’re back to the maxim, “If it seems too good to be true, it is too good to be true.”

Consider your options carefully when they refuse to honor their agreement.

Intel’s X86 Decades-Old Referential Integrity Processor Flaw Fix will be “like kicking a dead whale down the beach”

Brian, Brian, Brian. Really, do you have to lie to cover your ass? Variations on this “exploit” have been known since Intel derived the X86 architecture from Honeywell and didn’t bother to do the elaborate MMU fix that Multics used to avoid it.

We are talking decades, sir. Decades. And it was covered by Intel patents as a feature. We all knew about it. Intel was proud of it.

Heck, we even saw this flaw manifest in 386BSD testing, so we wrote our own virtual-to-physical memory mapping mechanism in software and wrote about it in Dr. Dobb’s Journal in 1991.

You could have dealt with this a long time ago. But it was a hard problem, and you probably thought “Why bother? Nobody’s gonna care about referential integrity“. And it didn’t matter – until now.

Now a fix is going to be expensive. Why? Because all the OS patches in the world can’t compensate for a slow software path. We’re looking at 30% speed penalties, sir.
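The cost shows up wherever code crosses the user/kernel boundary, which the OS-level patches make more expensive on every crossing. A rough way to feel this yourself is to time a tight loop of trivial system calls; this micro-benchmark is only a sketch, and absolute numbers vary wildly by machine and kernel version.

```python
import os
import time

def avg_syscall_ns(n=100_000):
    """Average wall-clock cost, in nanoseconds, of a trivial system call.

    os.getpid() makes a real kernel round trip on each call, so the
    per-call average approximates the user/kernel crossing overhead.
    """
    start = time.perf_counter_ns()
    for _ in range(n):
        os.getpid()
    return (time.perf_counter_ns() - start) / n
```

Run it before and after enabling the kernel page-table isolation patches and the difference is the tax every syscall-heavy workload now pays.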

Now, we can probably and properly blame the OS side with their obsession with bloated kernels.

But you promised them that if they trusted your processors, you’d compensate for their software bottlenecks and half-assed architectures. And they believed you.

So now you’ve got to fix it, Brian. Not deny it. Fix it. Google didn’t invent the problem. It’s been there in one form or another since the 8086 was a glimmer in Gordon Moore’s eye.

And now it’s going to cost Intel. How much is up to you.

Silicon Valley Bank and eVestment Hit a Nice Exit with Nasdaq

People often fixate on the Home Run Deals: the Googles, the Facebooks and the like. A home run deal is a 50x–100x return on a deal where you get in 1) really early and 2) really cheaply, 3) reach an astronomical valuation, and 4) stay with the entire ride without being diluted. It is the stuff of movie magic and business books. But it’s also extremely rare and risky.

Facebook was able to manufacture a success at a critical time by using Russian money in place of an interim round. Plus they locked up all world markets before positioning their final public offering. This was done at the end of the process, not the beginning. They discovered you don’t wait for things to go up. You force them to go up by controlling everything. Yet it was still a white-knuckle ride, and there were a lot of very big mistakes that were costly – the pajama adventure at Sequoia was quite memorable and led to quite a bit of trouble for the lad. That’s why he needed the Russian money. And as one VC likes to say, “You don’t screw around with the Russians”.

So how do folks make money in the vast majority of Silicon Valley tech deals? What early investors look for is someone who knows how a business works. They have a strategy to be able to consistently grow a business taking advantage of their expertise and contacts. They have sufficient resources to fund that growth while widening their customer base so as to present a desirable acquisition. And they need more than one acquisition candidate who sees them achieving this steadily.

And a good example of a reasonable and profitable tech venture investment is the recently announced acquisition of eVestment by Nasdaq.

The company raised $19M from Silicon Valley Bank. Over the next six years they did seven acquisitions, allowing them to aggregate the value, obtain several key customer accounts, and present a compelling proposition for Nasdaq to acquire for $705M. Their investor was in it for the long-term as it takes time to create a credible business in the financial sector.

For those who think this was too long to wait for the money: if you took that $19M and compounded it at an annual yield of 9% over six years, you’d end up with a future value of around $32M.
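The arithmetic here is plain compound interest: future value = principal × (1 + rate)^years. A quick check of the numbers in the deal:

```python
def future_value(principal, rate, years):
    """Compound a principal annually at the given rate for `years` years."""
    return principal * (1 + rate) ** years

# $19M at 9% annually for six years: 19_000_000 * 1.09**6
fv = future_value(19_000_000, 0.09, 6)   # about $31.9M -- the "around $32M"

# The $705M Nasdaq acquisition against that opportunity-cost benchmark:
multiple = 705_000_000 / fv              # roughly 22x
```

So a $705M exit against a $32M future-value benchmark is about a 22× return, which is the figure the post cites.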

Their primary investor, Silicon Valley Bank, made 22 times the future value of that money invested. The rule of thumb is ten times for a smart investment. Bravo to eVestment and Silicon Valley Bank.

Is Google Just Another Uber Bro? Unraveling the Tangled Silicon Valley Tech Geek Myth

The most recent attack on women and minorities in Silicon Valley has arisen unexpectedly from Google. Mounted by an anonymous Google engineer as a “manifesto”, it presents no facts, regurgitates disproven theories on the “biology” of men and women and, most tellingly, blames diversity for upper management’s cancellation of underperforming products at Google.

Many people have already started to address the clear absurdities of this person’s claims, from both an internal and external perspective, along with myriad professional press musings too numerous to mention (try Google).

Of special note, the key weakness in this memo is that women and minorities have nothing to do with the lament of Google “demoting the core business” and “killing beloved services”. This is odd, and leads to speculation that this “manifesto” is nothing more than a disguised attack on the streamlining decisions of Google CEO Sundar Pichai. In other words, this guy is trolling people with a red herring of “diversity”, and the real intent is to embarrass Google executives for cutting a “beloved” project. Move along folks – nothing to see here.

But the fact that many people see this rant as factual does merit some discussion. This means we have to dive into history a bit. I know most people have little patience for the past. But it does help to know how we got from here to there. It wasn’t random chance.

There are a lot of women who have worked on technology projects in SV over the years, but you wouldn’t know it, because no one writes about it. So no one believes it happened, even though this is a young industry and most of us are still alive. That missing piece of the story leads to the notion that women have never had any involvement in technology and that it’s a man’s world. It’s an absurd notion.

Whenever one sees these attitudes, one also sees history deconstructed to focus on only one person at the expense of others – unless earlier in the history of the field there were key women who could not be deconstructed, as physics has Curie and Meitner. Those who control the information – tech journalists, writers and amateur enthusiasts – have had an almost laser focus on men. Why?

In practice, tech readers rarely notice the name of the author on the article, which is why it’s pretty easy to write about hard tech even if you’re a woman. But they do notice who is being interviewed, reviewed, or cited as an authority, and it’s usually a man because, as any editor or publisher will tell you, “That’s what the reader wants”. If this seems circular and under-justified, it is.

The second factor is the current obsessive focus on low-level “pipeline” women in tech. While it’s important to get women into the system, it’s actually retaining them at the lead engineer, line manager and director level that matters. There is no focus on that.

Third, it was not uncommon for women to be part of a Founding Team that was funded in the 1980s. Startup teams were typically at least three people, reflecting technology, operations and/or finance, and sales. Even if a woman was not an engineer, she would still be viewed as an equal Founding Member for her business, marketing or sales skills. This was also true of black men and women, especially in business and sales, due to their strong presence in old-line companies like IBM.

This Founding Trinity structure became less common as the Cult of the Geek became a meme in Hollywood as a follow-on to the Western antihero. The story was recast from a team of rather dull startup business equals making spreadsheets and chips and PCs to the lone tech-guy going against all odds to fight the System. It was amusing at the time, but it’s been done to death. How many men are going to write another Steve Jobs movie or opera? I’d rather see an opera about Marie and Pierre Curie. Now that was a scientific tour-de-force love story.

But it’s considered normal storytelling in journalism and entertainment to interview / romanticize / suck up to men when anything serious is discussed, to avoid alienating the reader, a.k.a. men. Are only men readers, users and developers of tech? No, they aren’t. This unquestioned assumption perpetuates the notion that women don’t work in the field, aren’t interested in studying or reading about tech (or science, economics, politics …), and that men are the only instigators of creativity and change. It’s lazy writing, but it’s easier to meet Internet deadlines when you write by recipe rather than by the old-fashioned research / rewrite / review.

Coincident with this fascination with the lone geek, the tech people who rolled out of Berkeley and Stanford at that time found themselves in a rather unpleasant quid pro quo: to get a good reference, a student might have to spend considerable amounts of personal time on unpaid or low-paid tech projects. Since jobs were rather scarce (we went through several big recessions in the 1980s, kids) and a reference was really important to a decent job, there were lots of people willing to do this. It became a bit of a seller’s market. In a seller’s market, choices often become based on whim and comfort, and that’s exactly what happened. A like-prefers-like situation developed among key professors and their lowly student help, to reduce management overhead and increase their collegial network. It’s human nature to seek familiarity and comfort when excellence is a commodity. This myth of “someone like me is easier to manage” prevails today.

There was also considerable selection bias in computer science and engineering majors in the 1980s. At Berkeley, there was so much demand for engineering one had to compete to enter as a freshman in the college, which precluded people who were unsure from entering the major. A woman who wanted to be an engineer had to not only know how to apply directly to the College of Engineering at Berkeley, but also have the confidence and will to be an engineer despite the high school tendency to channel women towards the “softer” majors (if encouraged to go to college at all).

The safety valve for learning programming was that in the 1980s there were few restrictions on non-majors taking CS courses at Berkeley, and those could easily be waived by a dean. Berkeley tightened this loophole in the 1990s due to budget cuts, essentially cutting out many people who, the decade prior, could still take CS courses while in other majors. This led to an EE/CS bottleneck. Stanford had a much smaller pipeline, as do most top-tier private schools. Berkeley was the big one for matriculating people in the field in SV, and its stranglehold on access had a profound impact for two decades.

This skinny-pipeline, reduced-risk preference for “guys like me” was also the golden ticket to investment referrals. The hard tech innovation that flowed out of universities – from Berkeley Unix, to RISC, to databases, to languages – was a lucrative and exciting opportunity for people who resented the indentured servitude of academia. They left to found startups based on these technologies. And the most skilled at these technologies were the same people who had been most willing and able to do work for nothing. Stanford, sensing an opportunity, actually refined the pipeline for investment, offering students access to alumni referrals and networks for a “piece of the action”, and reaped a windfall. Berkeley, in contrast, retreated further into academic narrowness, resenting the desertion of so many into the very industries they helped spawn.

The reason we are seeing discontent today has two key factors: 1) the ability to access excellent introductory and focused courses in programming at a cost-effective level is within the reach of many, and 2) the value of an EE/CS degree has declined. The latter is a result of SV growing to encompass mature industries and verticals. Unlike twenty years ago, it isn’t particularly important for every programmer to know how to write a compiler or understand graph theory, and many excellent programmers are self-taught – strangely enough, just like many of the early SV pioneers. Most programmers and engineers also work on extant projects, adding some code here, fixing a bug there, and rarely work on a new project or technology de novo. There is more demand for Stanford business school graduates to manage logistics and funding than Berkeley CS programmers to create new technologies.

In addition, the reliance on global access to talent has had an unexpected effect. The number of women from other countries with STEM degrees working in companies in SV is quite high. The women I meet at women-in-tech events (and by this I mean hard tech, since that is my field) are predominantly foreign researchers, programmers and engineers from India, China and former Eastern Bloc nations. There is far less stigma for a woman to go into a STEM field in these countries, and it shows in practice. These women are educated, ambitious and not afraid to speak out.

The American women I see at technology events are most commonly clustered in the data science area, and often possess advanced degrees in STEM fields. They are comfortable with data science because many STEM fields work with very large datasets and the tools, techniques and processes are the same when one is analyzing weather patterns or consumer patterns. There is also a reemergence of the value of biology, physics and mathematics degrees in biotech, aerospace, and fintech, respectively. In all cases, the calibre of talent is high and increasing.

The preference of companies like Google for obtuse whiteboard quizzes from upper-division CS classes over work, references and experience to validate “fit” reinforces a “CS degree from a top-10 university” bias that is obsolete in industry today. It also has the effect of favoring the hiring of recent college graduates over those who have more experience.

Most of the tech pioneers – women and men – who actually did accomplish interesting projects / research / startups / technologies in the prior generation would be weeded out of the hiring pipeline today because, while they had a heck of a lot of experience working on technology projects, they didn’t spend their time studying code quiz books. I have a Berkeley physics degree. While it’s a plus to people like Elon Musk (who also has a physics degree) in emerging industries like new space or electric vehicles, it is a minus at Google, Facebook and other SV new old-guard companies. That is how their metrics and processes work, and they’re happy to keep it this way.

But are they really happy? Is this stasis good for their business? SV management has clearly not kept pace with the social changes in our industry, preferring nostalgia and a “that’s how it’s done” attitude to on-the-ground knowledge and change. This is the same pattern that emerged in the prior generation of old-guard companies of Xerox, Bell Labs and IBM, among others.

The hard truth is many successful SV companies are stuck in a midlife crisis where doing things the old way and fitting in is more important than challenging extant processes, technologies and business models. When this occurs, the time is ripe for a paradigm shift. This is now happening, and it’s making a lot of folks very uncomfortable. They lash out. They blame others. They want things to go back to the way they were.

To sum up: the requirement of unpaid labor and the selection bias of EE/CS professors on key projects at Berkeley and Stanford in the 1980s; the tightening of the pipeline due to budget cuts in the 1990s, which reduced the ability of men and women not already declared in the major to “try” programming; the increased reliance of investment on innovative technology startups funneled through this narrowed academic pipeline via referral; and the tech press’s fascination with and support of male enthusiasts who reinforced a Cult of the Geek – all of this led us to what we see today: a peculiar devout belief that programming is a man’s job. And that belief is threatened by the sheer number of women in SV now clamoring for a seat at the table.

Sweeping aside all the vanity, programming at its core is working with words in a stylized manner to achieve a desired function. I’ve always found programming more akin to writing a sonnet in terms of the structure than prose. Fixing code is like writing a limerick. It’s not male. It’s not female. It’s just a tool, no more male or female than a pencil. We spend a great deal of time teaching kids in school to learn the tools of language, writing, mathematics and science. Programming is just another tool, with no special or endowed gendered significance.

I think I’ll go write a sonnet. It’s been a while, but I still know how.

Google Cloud OnBoard San Francisco: Buried Alive by PR

There are times when a seminar or conference or training session induces trepidation because the expectations are high. One questions whether it was worth the time to travel to the destination, wait to park in the wreck-a-lot, find the coffee urn empty, and then find a chair in the back where you can barely hear the speaker. All the while, Slack messages are building up at home base. Is it worth it?

I’ve always found a reason to make the trip worthwhile – a small tidbit of knowledge, an off-the-cuff experience, an interesting speaker. Sometimes I run into an old colleague and we chat over lunch. Maybe even something *new*.

Then there was Google Cloud OnBoard San Francisco. This conference did not meet expectations. And given the stakes in the battle for the cloud between Amazon, IBM and Google, Google must excel. It did not.

Google advertised this conference as an all-day in-depth technical “training” session on the Google Cloud Platform (GCP). Anyone who has been to AWS conferences knows what that entails: a keynote on where the cloud is heading by an executive including *numbers* on pricing and trends, a set of overview talks on the technology, and then breakout sessions on specifics from hardware to apps so the attendee can focus on their specific expertise.

Google offered none of this.

Instead of an informative keynote by someone who matters on the trends and reasons for using Google Cloud, Google offered weak PR. Instead of analysis, Google offered bluster. Instead of technical expertise, Google offered mean jokes with a smattering of contempt for their audience.

And instead of “training”, Google offered a bait and switch “try and save” set of random slides presented by an obnoxious Henny Youngman wanna-be who would cut off the few good technical speakers with “Hey morons, even my wife can do this” remarks which were so annoying in their constant and disruptive repetition that the audience became more and more irritated and combative during the few times they were allowed to ask questions.

Of course, beyond insulting the men and women who just happen to be programmers, engineers and managers with the “my wife” spiel, the icing on the cake was the snarky responses to genuine audience questions comparing AWS issues to Google Cloud, something any attendee would be hard-pressed to answer at the weekly engineering team meeting. “Yes, boss, I spent the entire day at the Hilton in SF, and all I got was this lousy t-shirt” is not a satisfactory answer for any company engaged in serious work in the cloud.

The irony of all this hand-waving by Google’s inept training staff is that, if you can actually glimpse behind the curtain, there is some damned fine technology and thought put into Google Cloud.

You wouldn’t know that to hear the folks pitching it. Except for two competent people, a younger man and an older woman, who squeezed in a few technical discussions before they got shut down by the “host” of the event, it resembled nothing more than a mandatory cheer assembly for the losing high school football team at “More Science High”.

Two pluses. One: Since the training conference was a bust, we worked through the Qwiklabs “free” training kit (in record time I might add) of six hands-on “labs” on “GCP Fundamentals: Core Infrastructure” (Getting Started, GAE and Datastore, GCS, GKE, GCE, BigQuery). These were clearly ripped from a Spring bootcamp on Google Cloud and were rather haphazard, but far more informative than the actual training session. Two: If we passed Go (all the labs), they promised us $200 Google Cloud dollars.

We shall see if the GCP bucks are worthwhile or as useful as Monopoly money.

Amazon is a company that decided to monetize their own product for other enterprise customers. In Microsoft parlance, they “eat their own dog food”. This means if their customer suffers, they suffer. It also means as an early adopter, their technology is arcane and hybridized. But they understand their customer, are willing to “buy in” to hold the customer, and constantly advance their reach.

Google can afford to learn from Amazon’s mistakes and make a *clean* Cloud – efficient, effective and reliable – at a cost-competitive advantage. But their attitude towards the very people they need to woo away from Amazon, to put it mildly, stinks. They’re arrogant, abusive and vulgar – even to their own technology staff.

And that is worrisome to any engineer or manager betting the company on a system that works all the time, every time. If you can’t trust Google to be serious about where you store and access your critical data, why invest the time and money in moving to their platform?

My compliments to the chef on the roast beef sandwich at the Hilton. It was definitely the high point of the day. The hotel staff were also quite pleasant. A hospitality business understands that the experience matters to a customer. A bad experience, and they may leave a bad review and never return.

Get your act together, Google.

Hello world!

After a non-brief hiatus where health matters intersected with work matters, I’m back to writing about technology, policy, people and innovation in Silicon Valley.

I’ve always lived in Silicon Valley. I was born in Fremont, got my physics degree at UC Berkeley, and have worked at and co-founded several tech companies here. I rode my bike through orchards and fields now filled with homes and shops. I drove two-lane roads now turned into always-busy expressways. I went to school with people who have gone on to successful careers, even reshaping industries… and some who are no longer alive.

I’ve lived in Los Gatos for the last two decades. It’s a nice town (really and literally, the Town of Los Gatos), with a splendid library, just out-of-the-way enough to participate in Silicon Valley without enduring too much of the transitory madness of chimerical tech trends.

It’s an easy drive to all those places that matter, although those places have changed too. What was hot is soon not. Money changes hands, or vanishes into pockets. And the must-hear pitch is as quickly forgotten as yesterday’s weather.

There are patterns and anti-patterns to Silicon Valley, this Valley of Heart’s Delight. But now the heart is made of silicon and transistors and zeros and ones. It beats in picoseconds through cores of processors and devices. Thoughts and dreams and desires, both subtle and base, are accessible with a touch.

Arthur C. Clarke said, “Any sufficiently advanced technology is indistinguishable from magic.” Perhaps this is why the gap between science and public policy, education and superstition has grown in the United States. This is a threat to Silicon Valley innovation and national security.

As the complexity of technological and scientific innovation increases, those who “productize” and “monetize” innovation have successfully hidden much of the actual scut work of design, development and manufacturing from prying eyes and, in some cases, regulatory fingers.

Corporate HR blacklists to protect Silicon Valley monopolies, the creation of fictitious businesses and paper transfers of technology IPR invented here to bogus tax havens, and the hands-off creation and supervision of manufacturing facilities in poor countries, with the inevitable stories of abuse and exploitation, all paint a less-than-flattering picture of a “maturing industry” disinterested in innovation.

But where does one go when the “renovation, not innovation” mantra becomes tiresome, and magic doesn’t cut it?

Complex products made “simple” are inherently deceptive in both design and intent, though not because they make technology accessible. Rather, the cult of “simple” design obscures technology as magic. And once technology becomes magic in people’s eyes, the value of the foundational knowledge and experience underpinning innovation is lost.

So we come full circle.

Can we “Tawk”?

Phil Bronstein today asked the unmusical question, “What Tech Buzzwords Make You Go, ‘Huh?’” He brings up terms like “interstitial” (like, look it up, buddy, it’s in the dictionary) and “open source” (if you don’t know this one by now, you’re doomed).

But what if the technical term is, to put it delicately, eff’d up?

A story from Jon Erickson, the legendary editor of the late, great Dr. Dobb’s Journal, told to yours truly to illustrate: one of the cover stories was on the Thompson AWK language, and as editor he set the enthusiastic tone (yes, some folks get really excited at the thought of AWK) with “TAWKing with C++”. However, somebody wasn’t minding their p’s and q’s (back when people actually did mind p’s and q’s). When the magazine cover came back for final review, it said something slightly different: “Twaking with C++”.

I don’t know if meth-heads read DDJ, but Jon wasn’t too pleased. Reportedly, everyone could hear the proof fly across the room and wham into the door. Oops.

Later, as a joke, the staff put together a fake cover with another “twak” reference. This is why journalists are heavy drinkers and why editors have short tempers.
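And for anyone who never caught the AWK fever that animated those DDJ cover stories, here’s a minimal taste of what the excitement was about, a sketch assuming any Unix shell with a standard awk on the PATH:

```shell
# AWK reads input line by line; $2 is the second whitespace-separated field,
# and the END block runs once after the last line. Summing a column is one line:
printf 'alpha 1\nbeta 2\ngamma 3\n' | awk '{ sum += $2 } END { print sum }'
# prints 6
```

One terse pattern-action program instead of a loop, a parser and an accumulator. That economy is exactly what got people TAWKing.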