Light Peak certainly seems to have a lot of potential. It was a brilliant move for Intel to use the existing mini DisplayPort connector. It makes the port immediately useful, and quite capable. With existing, cheap adapters, you get HDMI (with 8 channel audio), DVI, full-size DisplayPort, and legacy VGA to connect any manner of display.
Better yet, you’ll be able to get data port adapters too. Intel and Apple say that Light Peak to eSATA, or USB, or FireWire dongles and hubs should be available. Apple, of course, is surely going to release a new line of Cinema Displays that support the port, and will serve as a full “docking station” replacement. But there will probably also be actual “docking hubs” too.
Plug in one cable to a nice “hub”, and you could easily have a cluster of USB ports, a couple eSATA, HDMI, and DVI ports. Support for the new standard has been announced by Promise, LaCie, Aja, Apogee, Avid, Blackmagic, and Universal Audio, among others. And why wouldn’t they? They can keep everything about their internal implementations essentially the same. Add a pair of Light Peak ports, the second for pass-through, and internally convert from Light Peak to USB3 or eSATA or whatever interface your device is already using. That doesn’t give you added speed, but it makes it compatible quickly and easily. Even if some device manufacturers don’t directly support Light Peak, adding a USB3 or eSATA adapter to support those types of devices should be quite simple in time. Alex Lindsay will finally be happy, I expect.
And that brings me to what I’m actually most interested in… DisplayPort can also be split with smart hubs and used to drive multiple displays. I’m not sure if Thunderbolt supports DP1.2, but if so things could get seriously interesting. As Ryan Smith from AnandTech explained for the launch of the AMD Radeon HD 68×0 series:
At the moment the feature AMD is touting the most with DP1.2 is its ability to drive multiple monitors from a single port, which relates directly to AMD’s Eyefinity technology. DP1.2’s bandwidth upgrade means that it has more than enough bandwidth to drive even the largest consumer monitor; more specifically a single DP1.2 link has enough bandwidth to drive 2 2560 monitors or 4 1920 monitors at 60Hz. Furthermore because DisplayPort is a packet-based transmission medium, it’s easy to expand its feature set since devices only need to know how to handle packets addressed to them.
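A quick sanity check on the bandwidth math in that quote. These are raw, uncompressed pixel payloads at 24-bit color; a real link also carries blanking intervals and protocol overhead, so treat this as a back-of-the-envelope sketch rather than a spec calculation:

```python
# DP1.2 payload rate: 4 lanes x 5.4 Gbps, after 8b/10b encoding overhead
DP12_PAYLOAD_GBPS = 17.28

def display_gbps(width, height, hz=60, bits_per_pixel=24):
    """Raw pixel bandwidth for one display, in Gbps (no blanking overhead)."""
    return width * height * hz * bits_per_pixel / 1e9

two_2560 = 2 * display_gbps(2560, 1600)   # ~11.8 Gbps
four_1920 = 4 * display_gbps(1920, 1080)  # ~11.9 Gbps

print(f"2 x 2560x1600: {two_2560:.1f} Gbps")
print(f"4 x 1920x1080: {four_1920:.1f} Gbps")

# Both configurations fit comfortably under the DP1.2 payload rate,
# which is consistent with the claim in the quote.
assert two_2560 < DP12_PAYLOAD_GBPS
assert four_1920 < DP12_PAYLOAD_GBPS
```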
Yeah. Eyefinity. Apple switched to AMD GPUs and also happens to make a big play using a DisplayPort capable connector? Coincidence?
So we get a hub:
The alternative method is to use a DP1.2 MST hub. A MST hub splits up the signal between client devices, and in spite of what the name “hub” may imply a MST hub is actually a smart device – it’s closer to a USB hub in that it’s actively processing signals than it is an Ethernet hub that blindly passes things along. The importance of this distinction is that the MST hub does away with the need to have a DP1.2 compliant monitor, as the hub is taking care of separating the display streams and communicating to the host via DP1.2. Furthermore MST hubs are compatible with adaptors, meaning DVI/VGA/HDMI ports can be created off of a MST hub by using the appropriate active adaptor. At the end of the day the MST hub is how AMD and other manufacturers are going to drive multiple displays from devices that don’t have the space for multiple outputs.
Three or four 1080p monitors all hooked up to a MacBook Pro with a Sandy Bridge CPU (and a giant external super-fast RAID)? Yes please!
So, perhaps you could run Eyefinity via a hub which also has all the data ports we could want? With Light Peak running data as PCIe, you could even have a second AMD GPU in there, unfettered by the power and thermal envelope requirements of notebooks, in CrossFire (though admittedly somewhat bandwidth starved, since the controller is internally connected via a 4-lane PCIe bus). Sounds fantastic!
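To put "bandwidth starved" in perspective, here's a rough comparison of PCIe 2.0 link widths. The 500 MB/s per-lane figure is the usable rate after 8b/10b encoding; real-world throughput depends on the controller, so this is only an order-of-magnitude sketch:

```python
# PCIe 2.0: 5 GT/s per lane, ~500 MB/s usable after encoding overhead
PCIE2_LANE_MBPS = 500

def link_gbytes_per_sec(lanes):
    """Approximate usable bandwidth of a PCIe 2.0 link, in GB/s."""
    return lanes * PCIE2_LANE_MBPS / 1000

x4 = link_gbytes_per_sec(4)    # what a 4-lane Thunderbolt controller sees
x16 = link_gbytes_per_sec(16)  # what a desktop GPU slot provides

print(f"x4:  {x4:.0f} GB/s")
print(f"x16: {x16:.0f} GB/s")

# An external GPU behind a 4-lane link gets a quarter of the usual bandwidth.
assert x16 / x4 == 4
```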
The only downside is that Light Peak has to be integrated into the mainboard directly, and apparently it will be Apple-exclusive until “spring 2012”. Yeah. A whole year. I’d hoped we’d see this with the next “wave” of Sandy Bridge motherboards: Z68 boards and then Patsburg LGA 2011 boards. But no joy, it appears. I’m not sure if this is because Intel wasn’t ready to put it in their mass-market chipsets, because the chip isn’t ready (it is reportedly fairly large), or because they partnered with Apple and Apple required some exclusivity time. Who knows?
As for Apple’s offering, I hope they included Eyefinity support for those AMD GPUs. AnandTech’s article about the MacBook Pro launch mentioned that they’d have more on Light Peak soon (they did; it is here). I can’t wait.
Oh yeah… The name is terrible. Thunderbolt, really?
UPDATE: The new AnandTech article is a bit ambiguous on DP1.2 support. It says:
Thunderbolt shares the same connectors and cabling with mini DisplayPort, however Thunderbolt cables have different, tighter design requirements to fully support Thunderbolt signaling. DisplayPort is an interesting choice since it’s already one of the fastest (if not the fastest) desktop interfaces, topping out at 17.28 Gbps in DisplayPort 1.2 at lengths of under 3 meters.
Which seems good, but then he goes on to say (emphasis mine):
Back when it was Light Peak, the goal was to tunnel every protocol under the sun over a common fast link. Multiplex everything together over one protocol-agnostic link, and then you could drop relevant data for each peripheral at each device in the daisy chain. Up to 2 high-resolution DisplayPort 1.1a displays and 7 total devices can be daisy chained. Thunderbolt instead carries just two protocols – DisplayPort and PCI Express. Tunnel a PCIe lane over the link, and you can dump it out on a peripheral and use a local SATA, FireWire, USB, or Gigabit ethernet controller to do the heavy lifting. Essentially any PCI Express controller can be combined with the Thunderbolt controller to act like an adapter. If you want video from the GPU, a separate dedicated DisplayPort link will work as well.
The way I’m reading this quote is that the original spec for “Light Peak” supported 2 high-res displays via DP 1.1a (“back when it was Light Peak”), and that “Thunderbolt” (the shipping spec) supports “full” DP1.2 (and PCI Express). But this isn’t entirely clear. The Thunderbolt cable theoretically supports up to 20Gbps of unidirectional data (two channels at 10Gbps in each direction, in the current implementation), which means it could handle the full 17.28 Gbps of the DisplayPort 1.2 spec. However, that bandwidth needs to be shared with the PCIe bus data too, so it seems unlikely that you’d be able to use a single Thunderbolt port to do something like drive 4 1080p displays while simultaneously writing to a high-bandwidth external Thunderbolt RAID box.
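Here's the budget math behind that skepticism, using the figures from the text: two 10 Gbps channels per direction. The actual channel allocation is up to the controller and isn't public at this level of detail, so this is just a sketch of the worst case where a full DP1.2 stream and PCIe traffic share one pipe:

```python
# Per the text: two channels at 10 Gbps each, per direction
TB_CHANNELS = 2
TB_CHANNEL_GBPS = 10.0
DP12_GBPS = 17.28  # full DisplayPort 1.2 payload

total = TB_CHANNELS * TB_CHANNEL_GBPS     # 20 Gbps each way
leftover_for_pcie = total - DP12_GBPS

print(f"Total per direction:    {total:.1f} Gbps")
print(f"Left after full DP1.2:  {leftover_for_pcie:.2f} Gbps")

# ~2.7 Gbps of leftover headroom is less than even a single SATA 3 Gbps
# port -- not much room for a fast external RAID alongside 4 displays.
assert leftover_for_pcie < 3.0
```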
Of course, even if not, running 2 1080p displays and using a RAID box should be possible, and that is pretty darn cool.
I hesitated to post this because it seems to be fluctuating a bit, but I did want to report that I’ve consistently been getting 3G in downtown Bar Harbor, Maine over the past week (or perhaps a bit longer). I’m still dropping back to EDGE once I connect to the new tower at The Jackson Laboratory (inside the buildings there and in the surrounding areas). I’m not sure what the holdup is on that tower, since I know for sure that the tower itself is 3G-capable.
Either way, it appears that 3G has finally extended into downtown Bar Harbor and it seems to be working well. Great news!
Good news, Xoom fans! According to the folks over at Electronista, Verizon apparently has seen the light (or the threat of no sales, anyway) and has backed down. Their original plan was to require all customers who buy a Xoom tablet, whether contract-free at $800 or on-contract at $600, to pay for at least one month of data, plus a $35 activation fee. The net effect of the original plan was that “full price”, including fees, for the contract-free Xoom tablet was $855 (plus any applicable taxes, of course), or $126 more than the “competing” 32GB 3G iPad.
Verizon on Thursday began selling the Motorola Xoom and offered an olive branch to users. The company has responded to near-universal complaints over plan requirements and told Electronista it will no longer force those paying the off-contract $800 price to pay for at least a month of service to get Wi-Fi. The savings will slash the $55 minimum of ‘hidden’ fees for both the month of 3G and activation.
The $600 contract pricing by its nature still requires service and activation.
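The original-plan fee arithmetic above checks out. Note the $20 figure for one month of data is my inference from the stated $55 in "hidden" fees minus the $35 activation fee; Verizon's actual entry-level plan price may have differed:

```python
XOOM_OFF_CONTRACT = 800
ACTIVATION_FEE = 35
ONE_MONTH_DATA = 20   # inferred: $55 total "hidden" fees - $35 activation
IPAD_32GB_3G = 729    # Apple's list price at the time

xoom_real_cost = XOOM_OFF_CONTRACT + ACTIVATION_FEE + ONE_MONTH_DATA
premium_over_ipad = xoom_real_cost - IPAD_32GB_3G

print(xoom_real_cost)      # 855
print(premium_over_ipad)   # 126
assert xoom_real_cost == 855
assert premium_over_ipad == 126
```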
Why the sudden reversal? I suspect this picture explains it…
Either Verizon or Motorola didn’t expect the iPad until April. Either way, this is good news. I still think the price will need to come down, especially once we get the iPad 2 next week, but dropping the obscene forced plan sign-up is a good start.
Unfortunately, according to Mr. Topolsky at least, it appears they also need to actually finish the device before it will really be ready. It also appears that they still intend to charge the $35 activation fee any time you turn on data service on the Xoom, so you won’t be able to turn it on for a month here and there when you need it, as you can with the iPad, which is a serious problem. Word is not at all clear on this particular tidbit, though (of course, if it were good news, Verizon would be shouting it to the heavens).
I read today a great post at Business Insider (via Gruber, of course) called “The REAL Death Of The Music Industry.” The post was prompted by a chart that has been making the rounds on Facebook and whatnot illustrating the profit decline of the Music Industry in the digital age. I first saw it when Gruber linked to a Chart of the Day also on Business Insider.
In January, Bain & Company produced the following chart as part of their report on Publishing in the Digital Age (PDF):
Then on Tuesday, someone posted it on Flickr. Subsequently, Peter Kafka of Wall Street Journal’s MediaMemo noticed it and passed it along to Jay Yarow, who made it Business Insider’s Chart of the Day on Wednesday, citing Kafka and the Flickr post. On Thursday, the excellent John Gruber at Daring Fireball linked to it and between those two postings the chart garnered a fair bit of attention, including from the likes of apparent digital music expert Bob Lefsetz (“First in Music Analysis”). No one seems to have tracked it back to the original source nor noticed what happened to catch my eye straight away:
This chart sucks.
Michael DeGusta goes on to explain that there are a number of glaring issues with this chart, the two most significant of which are: that it is mislabeled and the data represents only US sales, not Global; and that the revenue numbers aren’t inflation adjusted. And so he made a new chart:
The new chart changes quite a few things. Of course, the precipitous climb of the CD doesn’t seem quite so anomalous, and the overall pattern looks more regular. But in the end, DeGusta points out that the situation for the RIAA actually appears to be much worse:
Wrong: The music industry is down around 40% from its peak in 1999
Correct: The music industry is down 64% from its peak.
Wrong: At least the music industry is almost 4 times better off than in 1973.
Correct: The music industry is actually down 45% from where it was in 1973.
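The mechanics behind DeGusta's correction are simple: convert each year's nominal revenue into constant dollars before comparing. The revenue and CPI values below are purely illustrative placeholders, not the report's actual data, but they show how a nominal "gain" can flip into a real decline:

```python
def to_real(nominal, cpi_that_year, cpi_base):
    """Express `nominal` dollars in base-year dollars."""
    return nominal * cpi_base / cpi_that_year

# Hypothetical figures: $10B in 1973 dollars vs $14B in 2010 dollars,
# with CPI roughly 5x higher in 2010 than in 1973.
rev_1973, cpi_1973 = 10.0, 44.4
rev_2010, cpi_2010 = 14.0, 218.0

real_1973 = to_real(rev_1973, cpi_1973, cpi_2010)  # 1973 revenue in 2010 dollars
print(f"1973 revenue in 2010 dollars: ${real_1973:.1f}B")

# Nominally 2010 looks bigger, but in constant dollars 1973 dwarfs it --
# exactly the kind of reversal the corrected chart shows.
assert rev_2010 > rev_1973
assert real_1973 > rev_2010
```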
Then he goes on to show that this trend is tied to the dramatic decline in album sales. Look at that colored chart up there showing the different types of music “products”: for all of the illustrated types except digital (and video), a huge portion of the sales were full albums. CDs commanded the best prices of all for full-album purchases. Remember CD singles? They were priced so that only collectors and people experimenting with a new artist bought them. For digital? Full-album sales are only a small fraction of digital music sales. Growing, but hardly substantial. Here’s the album-sales-only chart:
I think the music industry looks at that chart today and, by and large, blames Napster and Apple.
Mr. DeGusta continues:
That’s just over 1 album per person per year now, and only 0.25 downloaded albums per year. Here Mr. Gruber’s guess is more on target, though current numbers are still substantially below pre-CD numbers. In addition to piracy and the general lack of interest in buying albums vs singles (see below), it’s also possible that consumers’ ability to convert CD to digital versus having to rebuy vinyl albums on CD accounts for some of the disparity as well.
But wait… It sure seems not just “possible”, but quite likely that this is having a pretty big impact!
It appears if you look at “The Right Chart” that the CD-based wealth-producing empire shown in the original chart was less of an anomaly and more of a good return on a refinement of a plan. More like a product cycle. In the 1960s and 1970s people, probably mostly baby boomers, bought records in ever increasing numbers. But these consumers eventually built up enough of a back catalog and got old enough that they slowed down their purchases and fell into more of a “maintenance buying” pattern: buying new releases, but at a slower pace, because they already had plenty to listen to whenever they wanted (and people with kids have less time to devote to listening to music). There was some conversion as the format switched to cassette tapes, but mostly people just continued their maintenance buying pattern for new releases. Tapes were better than vinyl in some ways, but not in all ways, and not dramatically enough better to get people to re-purchase their back catalog. Plus, dubbing from LP onto cassette wasn’t great quality, but it wasn’t that hard, and you mostly just wanted the tape for the car or for your Walkman at the gym (or on the walk to school). Your kid (me) could probably figure out how to set it up for you.
And then came the CD format. It was so much more convenient in so many ways over cassette. Sure, the quality of the music was better, but I think the convenience of the format was actually more important in the long run. It allowed you to access the tracks as tracks, like a record, but in a compact size that didn’t degrade over time from needle-on-vinyl friction. It worked. It was small. It worked well in the car (after a little while). It let you skip the crappy songs on that old a-ha CD for the crowd-pleasing retro gems. There was no rewinding. There were no tapes eaten by the crappy car deck (though there was the disappointment of deeply scratched discs). It wasn’t great for portable use, but after a while we got manageable devices, and it was pretty easy to dub them onto cheap blank cassettes in the meantime. The first influx of adopters wanted the quality, but a bunch of the rest wanted the convenience.
So, just as the next big generation of music buyers, my generation, the sons and daughters of the baby boomers, joined the music-buying consumer class, they managed to lump-on an upgrade cycle with the older folks who already had a big collection of LPs and cassette tapes. The baby boomers finally rebought a lot of that back catalog of vinyl records. And us new consumers in the big “Generation X” mob? We bought CDs too, and entertainment executives built music empires.
But then something happened. That first dip is probably right about when the “upgrade CD sales” market hit saturation. When all the previous music buying generations finished rebuying their Beatles and Led Zeppelin albums. Maybe some of that second peak was the baby boomers convincing their parents to finally upgrade their Perry Como and Frank Sinatra records, but most of it was “us”. It was me and my friends (and enemies) and the kids coming up a few years before and after me. And then what I honestly believe the music industry deeply believes happened is this:
Napster started operating in June 1999.
But, perhaps this chart shows something else. Perhaps more importantly in the long run, this decline was also timed just as me and all of my friends and siblings’ friends were starting to graduate from college and go out into the real world to find jobs and spouses and houses. Perhaps, there was going to be a decline eventually. And there was.
In fact, if you compare the curve of the decline to the previous example around 1979, saturation in the LP market, the crash of the CD revenue picture looks much less steep and more controlled than the LP decline up until about 2003. The change then is timed nearly perfectly with the dawn of the “winning” next-generation format: the “legitimate” digital music sale. In the music executive’s vision of the world, this hypothetical “new format” should have been able to step in and take over where the CD left off. Even if it couldn’t drive a dramatic upgrade cycle like the CD, it should have been able to stop the precipitous decline and return us to a maintenance cycle.
But that didn’t happen. Look at the width of the bands of colors on that chart. The yellow CD band is awfully wide compared to the ones before it. It was time for a transition. It was actually long past time. They’d coasted on the CD wagon for far too long. They tried to bring us a bunch of new formats that promised convenient recording (but computers made CD burners cheap), smaller size (but CDs were already small enough), and better quality (already good enough), and no one wanted them. What was the winning next-generation format? It was not DAT or MiniDisc or SACD or DVD-Audio or any of the other next-generation formats that the RIAA labels pushed. It was digital music, brought to us by “Napster” and “The Internet”. By a guy in a bedroom who wasn’t even a particularly good programmer.
Once legitimate digital music sales were finally available to the mass market, they actually started to grow fairly dramatically. But as DeGusta points out, they weren’t the same kinds of sales. Instead of the $17.99–19.99 tasty goodness for a new release of a hot artist that was the RIAA’s bread and butter, people were “sipping” one song here and there for $0.99 a track. The buying public had decided, seemingly all of a sudden, that they no longer needed a full album of 3 hits and 8 filler tracks. And I think if you asked most music industry executives (at least those in charge a few years ago) why this was happening, their answer would have been in two parts:
They managed to shut down Napster, but the bad guys kept coming up with more and more distributed and difficult to disrupt systems.
They scrambled because none of the “next gen” formats they’d been pushing were anything like this new Internet-based distribution model. It was obvious at this point from looking at the declines that the CD had overstayed its welcome. The RIAA was suddenly desperate to give people a viable legitimate alternative to Napster for buying digital-only music, but obsessed over stopping the piracy losses and regaining some control over the distribution of music. And they had absolutely nothing prepared for this day. No arrow in the quiver to pull out and revolutionize their distribution system overnight. Simultaneously, many people in the industry also hated the very idea of digital-only music because it was killing what was, in their opinion, the golden goose: the CD. They had been trying to sell quality, but consumers were stubbornly choosing cheap, simple, and convenient instead!
And so even if they managed to pull a rabbit out and build a legitimate digital music distribution system that they could control with an iron fist (kicking and screaming), and to scare the mainstream youngsters off of those pirate networks, the experience with Napster had taught these consumers that music should be cheap, and acquired track-by-track in a giant smorgasbord, not collected into neatly packaged bundles sold in cellophane wrappers. The consumers who learned this best, of course, were their most important customers.
And then came Apple. They made it easy to manage and play these digital files and carry them around with you, but they also made it way too easy to rip all of your old CDs and put them on your iPod. Apple made the iPod cool and hip and targeted it directly to your exact demographic. These kids and young adults were all Internet connected and computer savvy (they were the ones who hooked up the record player to the tape deck, after all). They could certainly figure out how to rip a CD in iTunes, and they could help their parents do it just like they’d done so many years before. The quality wasn’t perfect, but it was “good enough” that most people didn’t care (outside of the people who had already bought SACD players, anyway). So there was no easy way to get your customers to re-re-buy their music again like they’d done before. No more driving the magic re-buy upgrade cycle. In fact, if the labels couldn’t come up with a way to build-in obsolescence with some crappy DRM scheme, there was never going to be another magic re-purchasing upgrade cycle like the CD, ever. From the executive’s perspective, Apple built a closed music store ecosystem where only Apple controlled the pricing for your products, and they were making all of the money selling their iPods off of the backs of your products. And just as you thought you might have some semblance of a savior in the insular and closed cellphone market (where you could sell the songs at outrageously higher prices), Apple released the iPhone.
And here we are. No one buys albums. Songs are a commodity. Apple controls the profits and the lion’s share of the distribution channel. The RIAA was even largely successful in beating back the piracy dragon (many pirates do still exist, but most “regular consumers” buy their music fair and square). They just aren’t buying it the same way, and it doesn’t look like there’s any way to go back again. The RIAA is hopelessly stuck.
So how did all this happen?
The labels (and the consumer electronics industry) clearly coasted on the back of the CD format for way too long, and didn’t see the digital age barreling at them at a hundred miles per hour, that much is obvious. But why didn’t they see it? I think the answer to that question is directly relevant to the situation for the Motion Picture and Television industries, and they’d better watch out. That Internet train is still a-rolling.
They didn’t see it coming because they didn’t understand that the user experience is king, and always had been. The reason you improved the raw quality was to improve the user experience, but that was not the only or even most important way to improve the overall experience.
Early on, most people’s MP3 files were audibly inferior to the quality on the CD. I had myself a large collection of 128kbps CBR MP3 files on a hard drive. It wasn’t great quality, but hard drives were measured in tens of gigabytes and, like the cassette before, it was “good enough quality” for many uses and much more convenient for carrying with you. Unfortunately for the labels, unlike the cassette, digital music had none of an analog tape format’s other disadvantages, and it could be easily distributed digitally or created yourself. MP3s were better than the “Red Book” CD in almost every conceivable way that mattered to the consumer. I actually once had one of those old Rio Volt MP3 CD players, and it was great. I could burn MP3s onto CD-Rs and carry 11 hours of music on one disc. I had a Pearl Jam disc and a Nine Inch Nails disc and so on and so forth. I could carry my entire music collection in a little CD book, and since the CDs themselves were $0.50 blanks, I could throw them away if they got scratched and just re-burn new copies from my hard drive.
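That "11 hours on one disc" claim holds up, by the way: 128 kbps MP3s on a standard 700 MB CD-R (exact capacity varies a little by disc):

```python
CDR_MB = 700      # nominal data capacity of a CD-R, in MB
MP3_KBPS = 128    # constant bitrate of the MP3 files

# 128 kbps = 16,000 bytes/sec -> ~57.6 MB per hour of audio
bytes_per_hour = MP3_KBPS * 1000 / 8 * 3600
hours = CDR_MB * 1e6 / bytes_per_hour

print(f"{hours:.1f} hours per disc")  # ~12.2 hours
assert hours > 11
```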
Playing music from even this early device was not better quality than what came before it, but it was far more convenient than traveling with a CD player and a giant book of CDs (which were time consuming to burn and big enough in bulk to be a pain). And picking and choosing what music to play on my computer at home became far more convenient than digging out and loading a probably-scratched CD into a sometimes-temperamental Aiwa bookshelf CD player system. Hard drive platter densities kept increasing, and soon any person with $80 and the will to do it could buy a hard drive big enough to store any rational-sized catalog of music. You carried your iPod around and used it in your car, and hooked up your computer or laptop to the stereo at home for music.
The music industry was completely unprepared because they didn’t see it coming. They didn’t see it even though there were a lot of clues in the previous LP-to-cassette-to-CD transitions. The reason most people switched from LP and cassette to CD was convenience, not quality. If you already had a computer, and by 2003 you did, what could be more convenient than a system like the iTunes store? You hear a song you like, you click buy, it costs almost nothing. Less than a cup of coffee. You can manage those songs and pick out precisely the song you want from an easy-to-use interface, and you don’t need to deal with the cruft and filler songs of the RIAA’s album-based distribution system. Sure, a few good bands still put out full albums that you buy occasionally, but these are relatively rare. Simultaneously, the reduction in capital at the recording studios makes it less attractive for labels to put out ambitious major-album projects like the hyped “double CD” albums of the early and mid-nineties. So they start shoveling even more “crapware” at the consumers, which reinforces the customer’s perception that there is a “lot of junk” and encourages the hunt-and-peck style of music acquisition.
And the death spiral tightens. To paraphrase Trent Reznor, they were just watching it burn in its steady systematic decline.
I think the video side of the entertainment industry is hanging on a nearly identical precipice. The landscape looks eerily familiar. But more on that later.
DigiTimes has a story up with a rumor that the HP TouchPad, which was unveiled only with a nebulous “summer” ship date (a classic Palm-esque pre-announcement debacle), may actually go on sale in April:
First-tier notebook brand vendor Hewlett-Packard (HP) is set to start selling its WebOS 3.0-based tablet PC, TouchPad in April with shipments to start delivery by the end of March, according to sources from HP’s upstream component partners.
But this is, of course, DigiTimes. The article is essentially just that: a statement backed up by no quotes or named sources. Engadget is skeptical, and so am I.
However, if this does turn out to be true, this is very good news for HPalm.