What’s in the Future for Digital TV?

Attended the panel discussion hosted by the Swiss Science & Technology Office, Wallonia Initiative, last night in San Francisco at their office downtown. On the panel were: Thomas Gieselmann, General Partner, BV Capital; Christina Ku, Consumer Electronics Group, Intel Corp.; Bob O’Donnell, Research Analyst, IDC; Bernard Rappaz, Editor in Chief of Multimedia at Swiss-French TV; and moderator Bruno Giussani, Knight Journalism Fellow at Stanford University.

One of the most intriguing takeaways was that the Europeans completely believed that broadcast TV as we know it is dead – no growth, no future. It’s all Internet and cellular.

Something to think about.

California Connects GenYs with Digital Media

We’ve got some great news for students in California who want to incorporate digital media into their studies. I just heard from Jeff Newman, who kindly reviewed my ACE2004 paper on massive video production and how it can be used to build multimedia community projects.

Jeff says: “As to the impact of such technology, California has recently enacted the Digital Arts Studio Partnership Demonstration Program Act, to make recommendations on a model curriculum and state standards for digital media arts provided to youths aged 13 to 18 years.”

“The inclusion of streaming video would enhance the effectiveness of this statewide effort. It would require the council to convene a meeting of specified entities to review the curricula recommended by the consortia associated with each partnership.”

It’s great to see California schools and government taking the lead on such a critical new technology that totally connects with GenYs. Thank you, Jeff.

Virtual Communities

Fred Turner, Professor at Stanford, spoke the other day at SCU on “Counterculture into Cyberculture: How the Whole Earth Catalog Brought Us ‘Virtual Community’”. Basically a history talk of the WELL and the organizing power of the hippie movement through the “whole earth” commercial powerhouse of the time. I found it curiously amusing – kind of like watching your mom in a “granny dress” or your dad with a beard strumming a guitar.

While I’m not quite the age of the “summer of love” crowd (I think I preferred collecting Breyer horses then), I have watched the evolution of these communities from a technology standpoint, and have seen both their strengths and weaknesses as they grew (and in some cases died). History and anthropology are avocations of mine, and since I’ve been involved in developing and growing relationships using technology, it’s a serious topic for me. So I went and listened.

One of the clearest problems stemmed from a willful misunderstanding of what the technology of the time provided and how it could be used. The WELL provided a novel community experience, all right, but it was simply too limited to be of much use in building the kind of movement envisioned by the “counterculture” – it was just too early, and easily supplanted by the Internet.

The evolution, technology, and mechanisms that would become the Internet were actually quite separate in design and execution, the rose-colored glasses of the counterculture notwithstanding.

I know a lot of folks (even Al Gore) would like to stake a claim to the Internet’s success, or as the syllabus stated “the network technology of the WELL helped translate the ideals of the American counterculture into key resources for understanding the social possibilities of digital networking in the 1990s.” But I’m afraid it just isn’t so – it evolved independently and with funding from some of those guys – the DOD comes to mind – that the counterculture tended to protest against.

I’ve never found the hippie movement to be very progressive in using technology, except for television. It’s understandable, given the paroxysms of the time – just as the nostalgia these folks hold for the period overlooks that it wasn’t so great for women and people of color.

But we should get real here: the right had used the Internet far more effectively to convey its message until Howard Dean went against his own party’s anti-tech bias and proved the Internet could be beneficial to the left.

It took thirty years, a lot of hard work, a ton of research funds, and real tech visionaries like Cerf, Kahn, and Berners-Lee to make the Internet the real world wide web.

Not all the cute stores that sold wood stoves, guitars and granny dresses could make one TCP/IP connection or HTTP web page.

Forget Printers and Film – It’s Digital Cameras and Clips

NYTimes had an interesting article by Claudia Deutsch on how Eastman Kodak can survive in the digital world. Very nice comments – they’re right on the money. Wish Kodak would listen, but their management still isn’t known for listening.

However, Kodak and other digital camera manufacturers have great advantages they haven’t even tried to leverage yet. While everyone else talks of film (the old cash cow), printers (they’ll always be beat out by better players here), and verticals (medical, archiving, old film conversion), the new market will be in something already on every high-end digital camera – video clip capability, ready-made for the Internet.

I especially liked Judy Hopelain’s remarks: “Kodak must do more to insert itself into the ways that people use digital photography. Why aren’t they offering something to let tweens and teens use images in instant messaging? Why aren’t they doing more with cellphone cameras?…But Kodak should rethink the decision to pursue printers and printing. What are they going to do that is unique and brand relevant against Hewlett-Packard and the other big boys? They’ll just dilute their brand and stretch their resources.”

According to Time magazine, there are people using this feature in v-logs. It’s a very small market, however, because the tools for turning the clips into an entertaining form that fits the parameters of Internet viewing are difficult and tedious to use correctly, and require considerable expertise – anything less and you get a laughable, out-of-sync amateur effort full of artifacts and lacking the glitz.

I’m so glad I’m with ExecProducer, since we’ve just completed trials with the University of California that took these raw, unpolished clips and turned them instantly, via email and the web, into Hollywood-style high-quality movies complete with the imprimatur of the university (branding), music, and technical excellence, ready for viewing on the Internet. All the director need do is “watch the rushes”. Filmography, metadata (important for enterprise), invitations, content, security,… all done. It will all be described in a paper accepted by the ACM SIGCHI Advances in Computer Entertainment conference (ACE2004) in Singapore.

The service uses any digital camera clip(s), corrects their imperfections, and leaves the customer feeling that they bought the right camera to make such cool movies. Important for a manufacturer who needs to get value out of such an expensive feature.
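
For flavor, here’s a toy outline of what such an email-driven pipeline might look like. To be clear, this is purely my illustration – hypothetical names and steps, not ExecProducer’s actual implementation:

    # Hypothetical sketch of an email-to-movie intake step -- my own
    # illustration, not ExecProducer's code. A submission arrives with
    # raw clips attached and is queued for automated assembly.

    import email
    from email.message import Message

    def handle_submission(raw_mime: bytes, render_queue: list) -> str:
        """Parse an emailed submission and queue its clips for assembly."""
        msg: Message = email.message_from_bytes(raw_mime)
        clips = [part.get_payload(decode=True)
                 for part in msg.walk()
                 if part.get_content_maintype() == "video"]
        render_queue.append({
            "sender": msg["From"],     # who gets the finished-movie URL
            "title": msg["Subject"],   # becomes the title card
            "clips": clips,            # raw footage, order preserved
            "branding": "university",  # imprimatur / title template
        })
        return "queued %d clip(s) from %s" % (len(clips), msg["From"])

The hard part, of course, is everything downstream of that queue – the correction, synchronization, and assembly that makes the result look Hollywood rather than homemade.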

It’s a great time to be doing this – all the signs are right. And, you know, you can’t print out a video clip.

Speed Stunts

Of course, never assume what the PR office of a university releases makes any real sense, as this SLAC press release demonstrates.

Looks like a commonplace database-style search trick: throttle the flow by converging faster than exponential backoff would, probing for the likely end-to-end flow rate at any given time. The question is, is this a good enough “good enough” strategy?
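
If I’m reading the release right – and this is my own toy reconstruction, not their code – the trick amounts to bracketing the sustainable rate and binary-searching within the bracket, rather than blindly halving on every loss:

    # Toy reconstruction of a rate-probing search (my reading of the press
    # release, not SLAC's code). probe(rate) stands in for a measurement
    # that sends at `rate` briefly and reports whether losses stayed low.

    def find_sustainable_rate(probe, lo=0.0, hi=10e9, tolerance=50e6):
        """Binary-search the highest rate (bits/s) that probe() accepts."""
        while hi - lo > tolerance:
            mid = (lo + hi) / 2.0
            if probe(mid):
                lo = mid   # rate held up; search higher
            else:
                hi = mid   # losses appeared; search lower
        return lo

A bracketing search converges in a logarithmic number of probes, which is why it beats exponential backoff when the path is stable – and why it may not be “good enough” when the path isn’t.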

Jim Gray, once again, was willing to provide me a bit of perspective on this.

Jim told me, “That stunt does not allow packets to get lost. There is some real engineering to make transfers at that speed actually work. But that is proceeding in parallel with the stunts.” That makes me feel more confident about what I was reading. Jim’s read is sensible and balanced, unlike that of the PR guys in Stanford’s licensing office.

I took a completely different approach with ballistic protocol processing: optimizing, at key portions of the network, the “best” transfer rate at that real-time instant – it’s a structural approach, really. I was uncomfortable setting an arbitrary good-enough limit given the ever-changing nature of the Internet at any point in time. I found that what appeared to be “good enough” was hard to prove good enough.

But of course, I trained in plasma physics, and every attempt in that field to bell the beast by setting arbitrary limits on containment has proven unsuccessful. So 40 years of research there has still left us with “good enough isn’t”.

So who do you think has the good enough solution? CalTech? SLAC as written up in this breathy news item? Or are they running after rainbows?

How Fast Can You Go?

I’ve been following the CalTech and CERN groups responsible for achieving what they claim is the “latest Land Speed Record” of 5.4 Gbps, with a claimed throughput of 6.25 Gbps averaged over a period of 10 minutes, according to the announcement to the Internet2 newslist on February 24th.
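
Back-of-the-envelope, to put that claim in scale: sustaining 5.4 Gbps for the stated ten minutes moves

    5.4 Gb/s × 600 s = 3,240 Gb ≈ 405 GB

of data in a single run.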

Of course, what does this mean? They claim that “best achieved throughput with Linux is ~5.8Gbps in point to point and 7.2Gbps in single to many configuration”. They claim they’re melting down the “hardware” at 6.6 Gbps. Is this true?

FastTCP and SSC – A Short Meditation

While we’re all oohing and ahhing over CalTech’s FastTCP bulk transfers and record busting using their new TCP congestion control – there’s an interesting paper (finally) by Jin/Wei/Low – contrast this with friendly rival Stanford’s high-speed TCP protocol, which changes the fairness (I find it interesting, and it provides some new ideas). Is either likely to impact anyone’s use of the Internet in the next decade, any more than studying cold fusion?
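
The heart of the Jin/Wei/Low design is a delay-based window update. Here’s a toy rendering of the published rule – the code and parameter defaults are my own paraphrase, for illustration only:

    # Toy rendering of the FastTCP periodic window update (Jin/Wei/Low).
    # Not their implementation; defaults are mine, for illustration.

    def fast_window_update(w, base_rtt, rtt, alpha=100.0, gamma=0.5):
        """One periodic FastTCP congestion-window update.

        w        : current congestion window, in packets
        base_rtt : minimum round-trip time seen (propagation estimate)
        rtt      : current averaged round-trip time
        alpha    : target number of packets queued in the network
        gamma    : smoothing factor in (0, 1]
        """
        # Queueing delay (rtt vs. base_rtt) drives the update, not loss.
        target = (base_rtt / rtt) * w + alpha
        # Move a fraction gamma toward the target, never more than doubling.
        return min(2.0 * w, (1.0 - gamma) * w + gamma * target)

The point to notice is that the update is driven by queueing delay rather than packet loss – which is what lets it hold a huge window steady on a long, fat pipe.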

I’m struck by how all this “record busting” may be a mere sideshow in the scope of real Internet usage, especially given the economic arguments against bulk transfers that Microsoft Research’s Jim Gray made at Stanford a few months back.

Jim said that it is cheaper to send a disk drive via FedEx overnight than to move the same data over any of these record-setting links – more real benefit to ordinary users than the contests provide. Could the CalTech and Stanford work be too early, given that hard reality? I leave it to CalTech and Stanford to battle out which is better a decade down the line. But what about what we can study now?
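
(As an aside, to make Jim’s argument concrete – with assumed, period-plausible numbers of my own, not his: overnight-shipping a 300 GB drive for about $30 comes to

    $30 / 300 GB = $0.10 per GB

while wide-area transit could easily cost an order of magnitude more per gigabyte, before you even count the engineering needed to sustain a multi-gigabit flow.)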

Maybe dealing with the long-latency network issue that Beck et al. find makes storage jitter intractable in the first place is the real challenge of the decade.

Recently a few database experts were suggesting that the end-to-end principle might be applied to databases. Beck, Clark, Jacobson, et al. don’t address this. The question “Are database commits end-to-end – do they satisfy the end-to-end principle?” remains open, even in the simplest case (akin to a chaotic strange attractor in physics).

Another question that came up was: “When do latency and jitter combine in a chaotic way such that reliability is injured in database transactions?”

Doyle at CalTech speaks of fragility vs complexity, and uses a combination of control theory, dynamical systems, algebraic geometry and operator theory to connect problem fragility to computational complexity, such that “dual complexity implies primal fragility”, in an NP vs coNP way.

It could be that robust-yet-fragile (RYF) is effective in defining what’s necessary to prove a viable global storage system. EtherSAN approaches the problem by idealizing the simplest end-to-end mechanism, TCP, with fundamental remedies – not increased complexity. RYF would indicate that this radically improves matters by removing primal fragility.

All this seems very similar to the old sustained-fusion power bursts we had in physics a decade ago. They kept everyone busy until the SSC debacle killed everything in the field. Plasma research is only now beginning to recover.

Let’s go back to fundamentals with Clark et al. on end-to-end, and simply consider Beck’s well-done arguments for small transactions in storage, cleaving to those goals only and not creating new ones. Reexamining definitions, and understanding them better, à la Bohr and mass, but not changing them.

The Perfect Eye

Well, just for fun, we followed Autumn along as she entered and competed in the Synopsys Science and Technology Championship with her project, The Perfect Eye, and created a little video on the science fair experience and the award ceremonies at Great America.

Since the project had the interest of some people in the San Jose Astronomy Association, a little notice on a local astronomy list mentioned the vid – and then the traffic started. Within one minute of posting, people were clicking and watching – thousands of views in less than a week, many of them complete repeat views. I had to allocate more bandwidth.

I guess everyone loves a science fair.

Massive Video Production (MVP), Berkeley physics, and all that

Wow, after a year of work with my old department at Cal doing a case study of technical issues in massive video production (MVP) for physics alumni outreach, lots of late nights, and crossing my fingers, it all paid off. I got an acceptance to the ACM SIGCHI Advances in Computer Entertainment Conference (ACE2004) to present the results of our work in Singapore in June.

I’ve never been to Singapore before, so I’m really excited. As CTO of ExecProducer, I’m proud of what we achieved over the last year technically. As a Berkeley physics alumna, I’m proud to have done a project with my department. And as a writer, I’m absolutely thrilled that they liked it.

Robotics and the Next Generation

Went to the 2004 FIRST Robotics Regional Competition in Silicon Valley, held at San Jose State University. And it was awesome to see all these kids running their “bots” through the paces. Got some great footage, even though Los Gatos High School’s robot broke midway through the competition.

Seeing the excitement, the fun, and the high-tech hijinks reminded me of the days when we were putting together workstations on-the-fly in a Berkeley workstation spin-out called Symmetric Computer Systems. I haven’t seen this kind of serious fun for a long time in the computer biz.

Maybe we should all be building bots…