Apple's Lightning-to-HDMI Dongle Secretly Packed With ARM, AirPlay
New submitter joelville writes "After noticing artifacts and a 1600 × 900 image in the output from Apple's new Lightning Digital AV Adapter, the Panic Blog sawed it open and found an ARM chip inside. They suspect that video bypasses the cable entirely and instead uses AirPlay to stream three inches to make up for the Lightning connector's shortcomings."
Car analogy (Score:4, Funny)
Re:Car analogy (Score:5, Interesting)
It's like having a 300HP engine in your fancy new sports car, but all it does is turn an electric generator that delivers 50HP to the electric drive motor.
Yet, they sell it to you as a 300HP sports car.
Re:Car analogy (Score:5, Funny)
...that delivers 50HP to the electric drive motor...
...using microwaves.
Re: Car analogy (Score:5, Insightful)
AirPlay is a network streaming technology. The network can be wired or wireless.
Re: (Score:3, Funny)
Re:Car analogy (Score:5, Funny)
No, it's like 10,000 spoons when all you need is a knife.
Re:Car analogy (Score:5, Funny)
No, it's like 10,000 spoons when all you need is a knife.
Don't be silly. There is no spoon.
Re: Car analogy (Score:3)
I see you've played knifey-spoony before...
Re:Car analogy (Score:5, Funny)
The irony is that a song called "Isn't it Ironic?" is not about irony.
Betteridge's Law of song titles clearly applies there.
Re: (Score:3)
The irony is that a song called "Isn't it Ironic?" is not about irony.
Then it's about Goldy and Bronzy, then?
Re:Car analogy (Score:4, Funny)
The comedian Ed Byrne said it best: http://www.youtube.com/watch?v=nT1TVSTkAXg [youtube.com]
Re:Car analogy (Score:5, Interesting)
This is a silly analogy these days. There are modern automatic transmissions that are basically just automated clutch-equipped gearboxes rather than the standard torque-converter-automatic that saps power like crazy.
Those transmissions transmit no less power to the wheels than a manual transmission would. Not only that, but they can shift faster than 95% of people can shift a manual transmission, so unless you're a freaking NASCAR driver you're going to get better performance using one of these than on a standard manual tranny.
Also they often have paddle shifters or similar so if you want to shift manually you still can.
Re:Car analogy (Score:5, Funny)
How quickly you change gear makes absolutely no difference to performance. *When* you change gear is crucial, and no automatic gearbox can solve that problem.
So what you are saying is that I can take 17 seconds to change a gear, but if I change it at just the right moment, I'll lose no performance at all compared with somebody who changes gears in 1 second?
Sir, I am in awe of your logic.
Re:Car analogy (Score:5, Insightful)
How quickly you change gear makes absolutely no difference to performance. *When* you change gear is crucial, and no automatic gearbox can solve that problem.
So what you are saying is that I can take 17 seconds to change a gear, but if I change it at just the right moment, I'll lose no performance at all compared with somebody who changes gears in 1 second?
Sir, I am in awe of your logic.
You clearly don't drive a manual.
I can accelerate into a corner, put the clutch in, slow and gear down as I turn and use heel to toe whilst releasing the clutch as I come out of the turn for better acceleration. Admittedly, I do this at roundabouts more often than I should.
An automatic gearbox has to wait until I start to accelerate to drop back down a gear. Automatic gearboxes are always reactive; a good manual driver is proactive. There is no way any current production auto can pre-empt what a driver is going to do. Maybe when we've invented enough AI, but then again, if we have computers advanced enough to tell what a person will do with that regularity, a person will be sitting in the back sipping martinis whilst the car does all the work.
When I drive autos, especially "sports" automatic transmissions, they always gear up when I put my foot down, then when they realise I've put my foot down they drop a gear and jump 2000 RPM. Not smooth at all.
What the GP should have said is that when you drive a manual, you can be in gear before you need it; an auto is always going to be in gear after you need it. Then again, if you drove a manual you'd know that (or be really, really crap at driving a manual; in either case my point stands).
Re:Car analogy (Score:5, Insightful)
Dual Clutch Transmission in a BMW 135i here. Long time manual driver prior to getting this car, and in fact my last car before it (BMW 545i) I went all the way to North Carolina to buy because it had a stick instead of an auto.
My car is really a truly clutchless manual that happens to have an automatic mode. When I start my car in the morning and shift it to "drive", the first thing I do is click the shifter into the gate to the left and then click up once. That puts the transmission in a pure manual mode that will hold a gear until you shift it. I could drive to work in first gear if I so desired and the transmission wouldn't upshift. Of course, I wouldn't be doing much good for my gas mileage either, but the point is that I could. There are paddles on the steering wheel, but I tend to use the shifter because it's natural for me to reach down there for gear changes anyway; I flip the shifter away from me to downshift, toward me to upshift. It's incredibly natural.
As for performance driving, I can anticipate the curves and downshift appropriately every time, far faster than I ever could with a stick and clutch. Sure, it's sequential in the same sense that my motorbike is; I can't go from 6th gear to 3rd, I have to click through all the intervening gears. However, the incredibly fast shifts mean that I can get from 7th (my top gear) all the way to 3rd in about the same amount of time as it would've taken me with a stick. I lose nothing.
And for those that say that shift times make no difference; I have dragged (on a track, thank you) two identical cars; one with the DCT and the other with a stick. We did two runs in our own cars and two in each others... the result was always that the DCT equipped car was a good 3/10 quicker consistently in the quarter. Part of that is final drive (the DCT has a different final drive that gets better acceleration at the cost of slightly worse gas mileage) but even calculating that in we figured the shift times were gaining 1/10 on the quarter. Not much, but still faster.
Having said all this, have you driven the most recent 8 speed autos coming out? I have driven a BMW 535i with the 8 speed and was incredibly impressed by that thing. Yes, it's a torque-converter automatic but the technology has come a long way. Modern automatics really don't lose anything to a stick unless you happen to be a professional race driver. Of course, that would have to be a professional race driver from before 1992, because almost all race cars today use sequential manuals or dual clutch transmissions... the days of rowing your own gears on the racetrack have been over for decades.
Re:Car analogy (Score:5, Insightful)
Next time you're driving behind me let me know by flashing your lights. When we take off at the traffic light going up the hill I'll put my foot on the clutch and then take a tea break while we count the number of horses powering my wheels while my foot isn't on the accelerator.
Question: do you actually believe the garbage you wrote? Automatic gearboxes can be tuned for performance to shift at the ideal spot every time. Maybe not in your shitty sedan, but there's a reason why many motorsports use automatic gearboxes.
Re:Car analogy (Score:4, Informative)
The only "motorsport" I can think of which favors automatic transmissions is drag racing, and those are specially built units. Slushboxes are no good for really any other type of racing.
Re: (Score:3)
Obviously I'm not talking about taking ten minutes to change gear, I'm talking about the difference between the 500ms it takes to change gear with a manual gearbox versus the 100ms it takes the very fastest dual-clutch systems to change gear.
If it's in the wrong gear at the wrong time, it doesn't matter how quickly it changes...
Re:Car analogy (Score:5, Informative)
I can do that without even needing more than one make.
GTI ($26k model), Beetle Turbo or diesel version ($24.5k), Sportwagen diesel station wagon ($27k), Jetta sedan diesel or hybrid ($24k), CC ($32k), Eos ($34.7k).
So that's six cars with an automated manual under $35k, and I didn't even have to leave Volkswagen.
Re: (Score:3)
Yeah, the Dodge Dart has a DCT in the Rallye version. But don't bother; it's attached to the worst engine I have ever had the misfortune to drive behind and the stupidest programming of the DCT I've ever encountered.
Now, if they put the DCT behind the new 2.4L engine they're planning for later this year, and fire their current DCT programmers we might actually be talking.
Re: (Score:3)
Not to mention it doesn't know to downshift when you're going downhill to save your brakes.
Actually, yes, it does. My VW won't do it right away, but it'll do it soon enough.
Re:Car analogy (Score:4, Insightful)
Also, I can shift faster than an automated manual, and I'd bet most other experienced manual drivers can too.
No you probably can't. Even the consumer-grade gearboxes now used in VAG cars have change times measured in hundreds of milliseconds (the worst case scenario being a shift to a gear the transmission is not prepared for), the best case scenario being the Ferraris with gear change times measured in tens of milliseconds. So I call this bullshit; even if you are capable of superhuman-speed gear changes, most experienced manual drivers are not. (I'm driving a manual, but that is because it is cheaper; if I had the money my car would have a double-clutch automatic gearbox.)
Re:Car analogy (Score:4, Insightful)
It's like having 1000hp hooked to an automatic transmission.
Anyone generating serious horsepower is using an automatic transmission.
The guys at Bugatti will sell you a 1,000* horsepower 7-speed manual transmission for $120,000.
The guys at Hughes Performance will sell you a 3~4,000 horsepower 2 or 3 speed automatic transmission for $8,000.
Really high horsepower cars don't even have transmissions, just a bunch of clutch plates that progressively engage until the tires are 1:1 with the engine.
*The Bugatti has to electronically limit the horsepower at low speeds or it would destroy their manual transmission.
Re:Car analogy (Score:4, Insightful)
It's like having a car designed to carry four people, but fitted out to carry six, the actual maximum it can carry at this time. But we wanted eight.
And it's not sold as carrying eight either. Back to actuals, it's not sold as a Lightning to HDMI cable, but a Lightning to digital AV cable.
Re: (Score:3)
It is sold as doing 1080p video though, which it doesn't.
Re: (Score:3, Funny)
How about a NasCar analogy?
They put a restrictor plate so that the cars can't go so fast. Thus they can race in an unsuitable venue to keep the rabid fans happy.
Re:Car analogy (Score:5, Funny)
How about a NasCar fan analogy? You think they have brains, but when you open up their skulls you find tiny Leprechauns jacking off to chrome hubcap advertisements.
Re:Car analogy (Score:4, Insightful)
Re:Car analogy (Score:5, Funny)
. . . it's like opening the hood of your new car, and finding a team of miniature Steve Jobs' bike pedaling the drive train while chanting "Don’t let the noise of others’ opinions drown out your own inner voice," and blowing the smoke of hallucinogenic mushrooms out through the catalytic converter while burning their votes for the new Pope living in a Crystal palace in the sky over Apple's new headquarters impounded at a dock in Amsterdam . . .
Who's been sleeping in my brain . . . ?
Smoking mushrooms? Talk about drug abuse.... (Score:5, Funny)
What a waste of psilocybin....
Re:Car analogy (Score:5, Insightful)
Re:Car analogy (Score:4, Insightful)
And from what we see here, it's markedly worse than the alternatives Apple shunned, but that were based on standards (MHL, USB3), because those would have prevented Apple from imposing drastic licensing conditions on accessory manufacturers.
Re:Car analogy (Score:5, Funny)
Re: (Score:3)
Actually, I always think of Thunderbolt and Lightfoot. [wikipedia.org] But that's just me.
Re:Car analogy (Score:4, Interesting)
The M5 does come from the factory with a 155mph speed limiter, actually.
Re: (Score:3)
Re:Car analogy (Score:4, Insightful)
Many German cars do, or rather did as the restriction is becoming less common. Similar to the situation with the Japanese manufacturers that agreed to artificially limit their engines' power, the larger German companies agreed together to limit their cars to 155mph. I don't know what the reasons behind this were, but it may have something to do with the tyres available at that time or general driver safety.
With a 155mph limit, every carmaker can claim to sell the fastest car.
Re: (Score:3)
I remember the previous generation of Audi's TT had this annoying habit of flipping over above 130 MPH. The solution was (a) add a spoiler and (b) limit the speed to 130 MPH.
Security? (Score:4, Interesting)
So I guess it may be possible to reprogram the ARM chip to maliciously invade the user's computer.
Might it even be possible to turn the adapter into a minion of evil by just connecting it to your computer, assuming you have the right software running?
So borrowing someone's AV adapter can now be a security risk?
Re: (Score:2)
Might it even be possible to turn the adapter into a minion of evil by just connecting it to your computer assuming you have the right software running?
Perhaps, but it is more likely that the device could be programmed via a JTAG port.
Re:Security? (Score:5, Interesting)
It does appear, from the speculation, that the host device sends the SoC its firmware when the adapter is plugged in. Hardly unusual: proprietary firmware blobs have been the curse of Linux driver developers for years, and RAM is cheaper than custom-masked ROM. If that is the case, then it may be possible to simply send a modified firmware (unless Apple have done some sort of crypto-signing). The hacked firmware would have no way to communicate back and would be lost upon reset, so you'd need to solder in a tiny battery or ultracap too. Beyond that, though, there is plenty of room in that chip to save a few frames. Hack the adapter, lend it to The Boss when he goes into the super-secret HR policy review board meeting, collect it back, extract the presentation, get the inside word on who is about to lose their job and who is getting a fat bonus. It's a doable exploit in theory, though the level of difficulty involved - reverse engineering the adapter and the firmware enough to build an evil version - means that anyone capable of doing so probably has no need to. The type of exploit researchers might perfect purely to prove it can be done.
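For what it's worth, the "room to save a few frames" part checks out on a napkin. Note the 256 MB RAM figure below is teardown speculation, not a confirmed spec, and raw 32-bit RGBA storage is an assumption:

```python
# Napkin math for the "save a few frames" exploit idea.
# ASSUMPTION: the adapter SoC has 256 MB of RAM (teardown speculation,
# not a confirmed spec) and frames are stored as raw 32-bit RGBA.

def frames_that_fit(ram_bytes: int, width: int, height: int,
                    bytes_per_pixel: int = 4) -> int:
    """How many uncompressed frames fit in the given amount of RAM."""
    frame_bytes = width * height * bytes_per_pixel
    return ram_bytes // frame_bytes

ram = 256 * 1024 * 1024  # assumed 256 MB

# At the 1600x900 output Panic observed, each raw frame is ~5.76 MB,
# so dozens of full frames fit with room to spare.
print(frames_that_fit(ram, 1600, 900))   # 46
print(frames_that_fit(ram, 1920, 1080))  # 32
```

So even under conservative assumptions, stashing an entire slide deck's worth of frames is well within reach.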
Re:Security? (Score:5, Interesting)
A while back someone noticed that their battery firmware updates were encrypted, but the password was embedded in the updater executable in plaintext. You could replace the battery firmware, and then, if you found a hole in the EFI firmware or OS, gain control of the computer. I wonder if they have learned from these mistakes.
Re:Security? (Score:4, Insightful)
Wouldn't it be easier to do it on the device itself?
Why and how it would be done (hypothetical) (Score:5, Interesting)
Wireless wire? (Score:5, Interesting)
Doubtful. More likely that it's streaming encoded digital video via the cable itself, and the components on the connector just decode the stream.
Perhaps this is a slight step forward, as far as technology is concerned, but it's a big leap back, as far as consumers are concerned...
Re: (Score:2, Insightful)
Someone analyzed the video; it's only 1600x900, not 1080p. That will probably come out later for you to buy.
But yeah, basically they left the parts out of the newer iCrap and then charge you more for capabilities the older stuff had.
Re:Wireless wire? (Score:5, Informative)
Rather they charge more for less capabilities: The old device supported real, uncompressed video. The new adapter has MPEG artifacts and added latency.
Re:Wireless wire? (Score:5, Interesting)
Perhaps you just need a different TV.
Remember, HDMI is just a superset of DVI, which generally works fine for a myriad of desktop computers.
My own Samsung A550 from a few years back does just fine with sync, and works very well with video games. Even with layers of potential latency bolted on (playing Super Mario Bros. on an emulator on a Wii outputting component video which is then turned back into digital video at the television and then scaled), it behaves just as well as I remember it with an NES hooked to a CRT.
For that matter, both of the DVI-connected monitors on my desk also show zero noticeable latency.
As to cables, the cheaper the better, in my experience (at normal lengths): I've had expensive HDMI cables with ferrite beads on them, and had no end of problems with them. I eventually removed the ferrites (with a sharp knife and a hammer), and they've been working perfectly for years... The cheap freebie cables that come in the box with gear or from bottom-of-the-barrel eBay sales all seem to work fine.
I have seen some TVs lately that had real, unforgivable latency problems, and they all happened to have been made by Sharp. These needed audio delays added in the AVR to make a movie play correctly, and were essentially unusable as a computer monitor or for video games.
Whatever the case, blaming HDMI (which really is a cursed piece of DRM-encumbered shit) for the wiring or latency issues is a non-starter. You're pointing your finger in the wrong direction.
Re:Wireless wire? (Score:5, Informative)
After they made such a big deal of the new dock connector, it turns out it is inferior to their competitors'. Samsung's modified micro USB connector does uncompressed full 1080p HDMI. The cables are dirt cheap too.
Re:Wireless wire? (Score:5, Insightful)
The new dock connector is superior in exactly two ways:
1. Thinner.
2. You can put it in either way up... because the device has additional electronics to detect which way around the cable is and adapt accordingly.
The second of those is a triviality: it really doesn't matter hugely if you can put the connector in first time without looking. It saves the user only a few seconds at most. The first is the only reason for Lightning. Consumer demand and Apple policy are towards thinner and thinner products, with Apple leading the charge: they introduced Lightning for the same reason the MacBook Pro lost Ethernet. The connector became the limitation on thinness, so it had to go.
Re:Wireless wire? (Score:4, Interesting)
Re: (Score:3)
The dock connector is thicker, though. To make a thinner iPhone, Apple had to ditch their dock connector. They needed a replacement, so they only had two options:
1. Use MicroUSB.
2. Come up with their own proprietary connector.
They went with #2 for business reasons. Exactly what those reasons are is known only to Apple executives, though my theory is that they wish to maintain a clear separation of accessories between 'iPhone' and 'everyone else' so they can better establish the iPhone as a premium brand and
Re:Wireless wire? (Score:4, Informative)
The first is the only reason for Lightning. Consumer demand and Apple policy are towards thinner and thinner products, with Apple leading the charge: they introduced Lightning for the same reason the MacBook Pro lost Ethernet. The connector became the limitation on thinness, so it had to go.
If the connector became the limitation, then Apple's engineers have failed. There are several phones thinner than the iPhone 5 on the market, not only currently but dating back to 2011 (the Motorola Droid RAZR, which was a shit phone for other reasons), and all of them had micro-USB connectors.
The real reason is that Apple wouldn't be caught dead using an open, common connector type that doesn't give them absolute control over aftermarket accessories. This is much like the nano SIM bullshit, where Apple claimed the existing SIM was too big and thus limiting device size, despite many smaller phones using larger SIMs. Again the reason is that Apple insisted on an edge-connected SIM card where the rest of the industry had it mounted next to the battery behind the removable cover.
Re: (Score:3, Informative)
Never apply DRM to someone else's work (Score:5, Interesting)
Whoa. Are you saying this is applying HDCP to everything it plays?
That would be very interesting, since if I made a video of my own and played it through this device, the television would be descrambling a technological measure which limits access, without my authorization. That's circumvention. This device from Apple, would cause the manufacture and sale of all HDMI compliant TVs to become illegal.
Good engineering? (Score:2, Insightful)
Remember when Apple was known (at least by the general public) as being the company with simple, elegant engineering?
How the mighty have fallen. Really, needing a computerized cable is just silly.
Re:Good engineering? (Score:4, Informative)
Remember when Apple was known (at least by the general public) as being the company with simple, elegant engineering?
How the mighty have fallen. Really, needing a computerized cable is just silly.
The problem is likely that Lightning doesn't have enough pins to just pass through HDMI like the old connector did.
Silly? Maybe, but all of Apple's competitors are doing something similar, because micro USB also lacks sufficient pins to pass through HDMI (http://en.wikipedia.org/wiki/Mobile_High-Definition_Link). Except they're shoveling half the chips into the device, which increases costs on that side.
Re:Good engineering? (Score:4, Insightful)
Re:Good engineering? (Score:5, Insightful)
Thing is, MHL sends uncompressed 1080p over a cheap, standardized cable. Apple's standard, evidently, does not. And like you said, it's worse than the old docking cable in this regard. Regression is extra silly.
Looking at most MHL cable prices from vendors, they're cheaper than Apple's adaptor, but not cheap.
And as I mentioned, MHL drives up device prices because it requires additional circuitry in the device. Standardized cable you say? Try plugging an MHL cable into a Nexus 7. Won't work? That's because the chips required for MHL were too expensive and they were left off the Nexus 7.
Shifting half the expense to the device and half the expense to the cable isn't cheaper, it's just moving costs.
Re: (Score:3)
That's only true if there's a 1:1 relationship between tablets and cables.
Re:Good engineering? (Score:5, Insightful)
10 seconds searching Amazon [amazon.co.uk] turned up an MHL cable for £3.50, extremely cheap. The Apple version [amazon.co.uk] is £37.
Standardized cable you say? Try plugging an MHL cable into a Nexus 7. Won't work? That's because the chips required for MHL were too expensive and they were left off the Nexus 7.
I'm not sure how that makes it non standard. Are you saying for something to be standardized every device must support it? That's crazy talk.
Re:Good engineering? (Score:4, Insightful)
You're being extremely disingenuous:
1- MHL is cheap; there are plenty of $2 MHL cables. If you like paying for brands and stickers, that's your choice... they have nice ones with golden connectors and one-way flux optimizations, I'm told.
1b- MHL is cheap; the cost to implement it is nowhere near whatever Apple are doing with their fake video cable.
2- MHL is a standard. The fact that some chose not to have the feature does not change that. A bit like... you know... your PC not being an FM radio does not make FM radios un-standard...
3- Are you trying to imply that MHL is as expensive as having a failed proprietary interface + **active** components to fake a high-def video link, but that the costs are just split differently? I can assure you that Apple's "solution" is several times more expensive both to implement in the device and for the cable. And wayyyyy worse in terms of quality.
Re: (Score:3)
Re:Good engineering? (Score:5, Interesting)
Really, needing a computerized cable is just silly.
Actually, it's a step forward and it's not the first technology to do this. The basic idea is, make the port a smart interconnect and let a smarter cable be more adaptive. That way a 4 meter cable can be tuned differently than a 2 meter cable and you can use the same port for a cheap copper cable or a long but expensive fiber cable. Regardless of how relatively expensive the cables are, replacing the computer is harder and adding new ports to mobile devices, even most laptops, simply doesn't happen. This makes a nice, future-proofed port for your laptop, phone, peripheral, etc. that will have real longevity.
Re: (Score:3)
That's true, as long as the connector and cable support the basic signaling bandwidth required for everything it is sold for. If Apple are having to compress the high-res signal to get it over the cable, then it's a step backward.
Re:Good engineering? (Score:4, Insightful)
If Apple are having to compress the high-res signal to get it over the cable, then it's a step backward.
Agreed. I did not mean to imply this was a good technology (either the port or the adaptor), just that conceptually putting a chip in the cable seems like an excellent idea.
Re: (Score:3)
Not really; most modern GPUs can do HDMI encoding, so there is no additional cost beyond perhaps an MHL-capable USB controller if you are not using a real HDMI port. HDMI was always supposed to be cheap to implement, otherwise it would never have taken off.
Switch inputs to the $99 Apple TV (Score:4, Insightful)
Sending video through Airplay is WAY easier than keeping cables around to hook up an iPad to a display
As opposed to keeping cables around to hook up a $99 AirPlay receiver to a display?
and having to know how to switch video inputs (still an unfathomable mystery to many)
If it's unfathomable to switch inputs to the iPad, it's just as unfathomable to switch inputs to the Apple TV.
I've fallen and am only outselling everyone else in the market by a huge margin!
Do you want me to go dig up the story about Nexus tablets outselling the iPad? I will if you want.
Google Nexus 7 tops iPad in Japan: Trend?--CNET (Score:4, Insightful)
"Based on a [December] survey of 2,400 consumer electronics stores in Japan, Google's Nexus 7 tablet had 44.4 percent of the market versus the iPad's 40.1 percent, according to Nikkei, Japan's largest business daily."[1]
[1] Brooke Crothers. "Google Nexus 7 tops iPad in Japan: Is this a trend?" [cnet.com] CNET, January 16, 2013.
So how did I fail?
Re:Good engineering? (Score:4, Insightful)
Err, never? Simple, elegant design maybe, but engineering?
Apple-style design requires a LOT of engineering.
The first Macintosh, for example, had a fully modern GUI in 128 KB of RAM. That's kilobytes. Very few engineers today could get anything useful to run in an eighth of a meg, and even fewer could include an entire OS, a GUI, AND still have room for an application. They made some pretty major compromises to get it done, but they did it.
More recently physical design has been more important for them. But that takes engineering, too. The Macbook Air doesn't look like it's been heavily engineered, but do you seriously think that a bunch of Art Majors figured out how to mass-produce a fully functional laptop that fits in a damn envelope with no engineering help?
I'm not saying Apple should be known as an engineering company. It's not a company that lives and dies by technical specs. But Apple's designs just don't work without some damn good engineers. Their work is very hard to see, largely because Apple wants their products to look seamless, but that doesn't mean their work never happened.
Disappointing for a new connector (Score:3, Informative)
With its limited pin count, it's not a surprise that the Lightning connector does not have the bandwidth to transfer uncompressed video. But it's disappointing for it to be so bad at compression, with the MPEG artifacts shown in the article, plus latency issues with encoding/decoding. On that point, the old connector was better, and micro-USB3 would have had enough bandwidth to avoid the issue completely.
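The arithmetic behind that claim is easy to check. One caveat: Lightning's raw signaling rate isn't public, so treating it as roughly USB 2.0-class (480 Mbit/s) below is an assumption for illustration, not a measured figure:

```python
# Link-budget sanity check: raw (uncompressed) video bitrate vs. the
# serial links being discussed. ASSUMPTION: Lightning is treated as
# roughly USB 2.0-class; its actual signaling rate is not public.

def raw_video_bitrate(width: int, height: int, fps: int,
                      bits_per_pixel: int = 24) -> int:
    """Uncompressed pixel data rate in bits per second."""
    return width * height * fps * bits_per_pixel

raw_1080p60 = raw_video_bitrate(1920, 1080, 60)  # 2,985,984,000 b/s

USB2_RATE = 480_000_000    # USB 2.0 high-speed signaling rate
USB3_RATE = 5_000_000_000  # USB 3.0 SuperSpeed signaling rate

print(raw_1080p60 > USB2_RATE)  # True: a USB 2.0-class link forces compression
print(raw_1080p60 < USB3_RATE)  # True: micro-USB3 would have had headroom
```

Uncompressed 1080p60 needs about 3 Gbit/s of pixel data alone, so any link in the half-gigabit class has no choice but to compress, while USB 3.0 clears it comfortably.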
Do you even know what "serial" means? (Score:5, Informative)
With its limited pin count, it's not a surprise that the Lightning connector does not have the bandwidth to transfer uncompressed video.
Good grief. How many pins, exactly, would you say are needed for a serial connection?
Now look at the end of any USB cable and the end of a Lightning connector. How do the pin counts compare?
micro-USB3 would have had enough bandwidth
Also look at how many pins are in a USB 3 connector (HINT: IT'S THE SAME).
This issue has nothing to do with bandwidth from Lightning.
Re:Do you even know what "serial" means? (Score:5, Informative)
"Good grief. How many pins, exactly, would you say are needed for a serial connection?"
One, if you're operating an old telegraph. Eleven, if you're doing HDMI. Four twisted pairs for differential serial, plus three that are used for control information. Monitor resolution detection, that sort of thing.
http://www.hdmi.org/installers/insidehdmicable.aspx [hdmi.org]
Some devices appear to do it with less, but they are actually using MHL, not HDMI.
Re:Do you even know what "serial" means? (Score:4, Informative)
Nope. He is correct. Just one.
And a grounded wire at each end; sure. But there is no need to run that along the signal line. ;-)
- Jesper
Re: (Score:3)
This issue has nothing to do with bandwidth from Lightning.
Actually it does. The Lightning connector appears not to have the signal integrity necessary to support the serial bit rate needed for 1080p video. The electronics and physical design limit the bandwidth available.
The number of pins has nothing to do with it. Micro USB 2.0 connectors support 1080p video via MHL. I don't know why Apple were unable to match that relatively mature and very low cost technology.
Re: (Score:3)
Re:Disappointing for a new connector (Score:4, Interesting)
Samsung's modified micro USB connector does full 1080p HDMI, as well as a variety of other stuff. Cables are dirt cheap, and for sync/charging any standard micro USB cable works.
This would appear to be a fairly epic failure for Apple, because they are now stuck with either artefacts or changing to yet another new connector for all future products.
Re: (Score:3)
The MHL standard supports up to 1080p/60 high-definition (HD) video
http://en.wikipedia.org/wiki/Mobile_High-Definition_Link [wikipedia.org]
Demo video of an Xperia T connecting at 1080p24: http://www.youtube.com/watch?v=TDJvgvbaR-w [youtube.com]
Re: (Score:3)
Re: (Score:3)
With its limited pin count, it's not a surprise that the Lightning connector does not have the bandwidth to transfer uncompressed video.
I totally disagree. Coax and Ethernet get you plenty of bandwidth on fewer pins. When Apple announced this thing, I was delighted that they must have some kind of brilliant plan for using these very few pins in a flexible, high quality, eventually low-cost manner. If their plan for flexibility was just "send a system image over USB, then connect via USB to that thing once it boots" then I am surprised and disappointed.
Costs may come down as we approach computing ubiquity, but this puts a ceiling on quality
Re:Disappointing for a new connector (Score:5, Insightful)
I don't think you grasp the actual advantage of Lightning. It has one HUGE advantage that no other cable would provide: it forces vendor lock-in while at the same time instantly obsoleting all previous Apple cables. It's a marketing dream! (A nightmare for users, but since when has Apple ever cared about them?)
Of course it has a CPU in it. (Score:5, Informative)
Of course it has a CPU in it. Something has to do the protocol conversion.
It's not clear that Apple's AirPlay protocol [github.com], which has HTTP connections in both directions, is involved. But the pictures indicate compression artifacts. The original article doesn't go into enough detail to determine whether image compression (like JPEG) or motion compression (like MPEG) is being used. An MPEG compressor would introduce visible lag between the master and slave screens.
Re: (Score:3, Informative)
Although I don't have the means or desire to test it, it is far more likely that they decided most of what people would want to output via HDMI was H.264-encoded video. So they made an interface where H.264 is streamed over the Lightning connector and converted by this adapter to HDMI. Probably both sides use HDCP or similar protections.
The limitations Panic encountered are because the video support in the iPad mini can only H.264-encode the screen (for 'mirroring') at lower-than-1080p resolutions.
Re: (Score:3)
Most phones and tablets output HDMI directly via either an HDMI or USB/MHL port. Most modern graphics processors support HDMI output. It is very surprising that Apple needs this extra processor when most ARM-based devices support HDMI natively.
But the real question is what else can it do. (Score:5, Funny)
I think we are missing the point a little here. They released a tiny computer for 50 bucks; now we just need a port of Cyanogen for it.
That is certainly one way to look at it (Score:5, Insightful)
Fact: Apple has an ARM processor in the cable. It is fair to assume the video is processed by the chip in the cable.
The rest of the facts in this case are just speculation:
* Is the design a 'limitation', or a design choice?
* Is the 1600x900 output seen by Panic a Panic problem or an Apple one? Is it a bug or a limitation of the hardware? File a bug and find out.
* Is the connector providing Airplay over the 6cm cable? Pure speculation. Sounds plausible, even clever, but that is just a guess.
It seems to me that there is certainly an interesting story in this adapter, but I don't think we know what that story is yet.
Poster/Article is way off ... (Score:5, Insightful)
The electronics involved have nothing to do with AirPlay, and this is not "news" in any way. Sorry to ruin excitement and conspiracy theories... :-)
I am willing to bet serious money that all these chips do is decode whatever proprietary protocol Apple uses for transmitting video over the Lightning [wikipedia.org] port to a standard HDCP [wikipedia.org] protected HDMI [wikipedia.org] signal. This is needed because the Lightning port has no other way of transmitting the video - and this has been clear from the day Apple revealed the Lightning port to the world. It is really just a high-speed 8-pin serial connector. Nothing else.
In addition, the chips probably try to introduce a classic vendor lock-in factor, making it hard for third-party vendors to provide similar cables and accessories for the Lightning port without paying royalties to Apple (read: legal tech extortion).
Also, the scaling-problems mentioned are without a doubt due to the screen-mirror scheme involved. If they streamed an actual 1080p video file directly, the result would likely be very different.
The speculation in the article is so far from reality it almost hurts... They get points for taking it apart and all, but they could have reached the correct conclusion merely by reading up on the existing specs of the Lightning port (if they had bothered to add a bit of digital-video knowledge from Wikipedia that is).
- Jesper
Re: (Score:3)
The HDCP side would most definitely not require that. It's a stream cipher, so aside from any buffering you might do if your HDCP solution was software rather than hardware (which would actually still seem pretty difficult to do even with a fairly stompy processor), it needs less than a kilobyte. The other side, who knows?
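The "less than a kilobyte" point is easy to illustrate. Here is a toy stream cipher in Python (a 16-bit LFSR keystream, emphatically NOT HDCP's actual cipher): each byte is XORed with the keystream as it passes through, so the cipher state stays a few bytes no matter how much video flows.

```python
def lfsr_keystream(state: int):
    """16-bit Fibonacci LFSR (taps 16,14,13,11), yielding one byte at a time.

    Toy keystream generator -- NOT HDCP's real cipher, just an
    illustration that stream-cipher state is tiny and constant.
    """
    while True:
        byte = 0
        for _ in range(8):
            bit = ((state >> 0) ^ (state >> 2) ^ (state >> 3) ^ (state >> 5)) & 1
            state = (state >> 1) | (bit << 15)
            byte = (byte << 1) | (state & 1)
        yield byte

def xor_stream(data: bytes, seed: int = 0xACE1) -> bytes:
    """Encrypt/decrypt by XORing with the keystream (XOR is symmetric)."""
    ks = lfsr_keystream(seed)
    return bytes(b ^ next(ks) for b in data)

frame = b"pixel data" * 100
assert xor_stream(xor_stream(frame)) == frame  # same operation decrypts
```

Note that the only state carried between bytes is the 16-bit LFSR register; nothing about the video needs to be buffered beyond the byte in flight, which is the GP's point.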
Re: (Score:3)
No, absolutely not. Then again, the device has 2 Gb of RAM, not 2 GB. Or in clearer terms: 256 MB. They just don't know how to read the numbers on the chips properly.
And it is not only the DRM, it is (likely) converting one digital video encoding to another - also called Transcoding [wikipedia.org].
- Jesper
Re: (Score:3)
It is correct that HDMI generally transmits uncompressed video, but it is absolutely not raw (or RAW) and it is encoded (as virtually all digital signals are, especially audio and video) ... :-)
Video transmitted over HDMI is encoded using Transition-minimized differential signaling (TMDS) [wikipedia.org] which is a variation of 8b/10b encoding [wikipedia.org].
Also, "grids of pixels" are not transmitted. I have no idea who told you that but don't listen to them/him/her. In an HDMI video signal, one line is defined as the "active" line, and
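The transition-minimizing stage of TMDS mentioned above can be sketched in Python. This is a simplified illustration of stage one only (choosing XOR or XNOR chaining to reduce transitions); the real encoder adds a second, DC-balancing stage that produces the tenth bit.

```python
def tmds_stage1(d: int) -> list[int]:
    """First (transition-minimizing) stage of TMDS encoding, simplified.

    Takes an 8-bit pixel byte and returns 9 bits (LSB first): the 8
    encoded bits plus a flag bit recording whether XOR or XNOR
    chaining was used. The full encoder appends a 10th DC-balance bit.
    """
    bits = [(d >> i) & 1 for i in range(8)]  # unpack LSB first
    ones = sum(bits)
    # XNOR chaining is chosen when the byte is "transition heavy"
    use_xnor = ones > 4 or (ones == 4 and bits[0] == 0)
    q = [bits[0]]
    for i in range(1, 8):
        if use_xnor:
            q.append(1 - (q[i - 1] ^ bits[i]))
        else:
            q.append(q[i - 1] ^ bits[i])
    q.append(0 if use_xnor else 1)  # flag bit: 1 = XOR, 0 = XNOR
    return q
```

The point of the scheme is that a serial link's worst enemy is rapid 0/1 transitions, so each byte is re-coded to minimize them before hitting the wire.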
I can't believe no one has asked yet, (Score:4, Funny)
I would have had first post... (Score:4, Funny)
...except my Apple ethernet cable needed a firmware update.
There is no Airplay involved. (Score:5, Informative)
Airplay is not involved in the operation of this adapter.
It is true that the kernel the adapter SoC boots is based on XNU, but that's where the similarities between iOS and the adapter firmware end. The firmware environment doesn't even run launchd. There's no shell in the image and no utilities (analogous to what we used to call the "BSD Subsystem" in Mac OS X). It boots straight into a daemon designed to accept incoming data from the host device, decode that data stream, and output it through the A/V connectors. There's a set of kernel modules that handle the low-level data transfer and HDMI output, but that's about it. I wish I could offer more details than this but I'm posting as AC for a damned good reason.
The reason why this adapter exists is because Lightning is simply not capable of streaming a "raw" HDMI signal across the cable. Lightning is a serial bus. There is no clever wire multiplexing involved. Contrary to the opinions presented in this thread, we didn't do this to screw the customer. We did this specifically to shift the complexity of the "adapter" bit into the adapter itself, leaving the host hardware free of any concerns about what was hanging off the other end of the Lightning cable. If you wanted to produce a Lightning adapter that offered something like a GPIB port (don't laugh, I know some guys doing exactly this) on the other end, then the only support you need to implement on the iDevice is in software, not hardware. The GPIB adapter contains all the relevant Lightning -> GPIB circuitry.
It's much the same with the HDMI adapter. Lightning doesn't have anything to do with HDMI at all. Again, it's just a high-speed serial interface. Airplay uses a bunch of hardware H.264 encoding technology that we've already got access to, so what happens here is that we use the same hardware to encode an output stream on the fly and fire it down the Lightning cable straight into the ARM SoC the guys at Panic discovered. Airplay itself (the network protocol) is NOT involved in this process. The encoded data is transferred as packetized data across the Lightning bus, where it is decoded by the ARM SoC and pushed out over HDMI.
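The packetized-transfer scheme the AC describes (encode, frame into packets, ship over a dumb serial pipe, reassemble and decode at the far end) can be illustrated with a toy length-prefixed framing in Python. This is not Apple's actual wire format, just the general shape of the idea:

```python
import struct

def packetize(stream: bytes, chunk: int = 1024):
    """Split an encoded bitstream into [4-byte length][payload] packets.

    Toy framing, not Apple's protocol: the sender cuts the stream into
    chunks and prefixes each with its length so the receiver can walk
    the byte pipe without any out-of-band signaling.
    """
    for off in range(0, len(stream), chunk):
        payload = stream[off:off + chunk]
        yield struct.pack(">I", len(payload)) + payload

def depacketize(wire: bytes) -> bytes:
    """Reassemble the original stream on the receiving (adapter) side."""
    out, pos = bytearray(), 0
    while pos < len(wire):
        (length,) = struct.unpack_from(">I", wire, pos)
        pos += 4
        out += wire[pos:pos + length]
        pos += length
    return bytes(out)

encoded = bytes(range(256)) * 4       # stand-in for H.264 data
wire = b"".join(packetize(encoded, 100))
assert depacketize(wire) == encoded   # round-trips losslessly
```

On the real adapter the "depacketize" side would feed a hardware H.264 decoder rather than a buffer, but the host-side indifference to what's on the far end is exactly this: the bus only ever sees framed bytes.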
This system essentially allows us to output to any device on the planet, regardless of the endpoint bus (HDMI, DisplayPort, and any future inventions), by simply producing the relevant adapter that plugs into the Lightning port. Since the iOS device doesn't care about the hardware hanging off the other end, you don't need a new iPad or iPhone when a new A/V connector hits the market.
Certain people are aware that the quality could be better and others are working on it. For the time being, the quality was deemed suitably acceptable. Given the dynamic nature of the system (and the fact that the firmware is stored in RAM rather than ROM), updates **will** be made available as part of future iOS updates. When this will happen I can't say for anonymous reasons, but these concerns haven't gone unnoticed.
Re: (Score:3)
And this is superior to having an adapter that converts HDMI (or whatever the default output is) to 'protocol X" how exactly?
I mean, you still need hardware in the adapter, and the adapter only has one output port, so what's the advantage? It seems like utterly pointless over-engineering, the sort you get when a company loses touch with consumers.
Re:There is no Airplay involved. (Score:5, Informative)
The reason why this adapter exists is because Lightning is simply not capable of streaming a "raw" HDMI signal across the cable. Lightning is a serial bus. There is no clever wire multiplexing involved.
The HD-SDI [wikipedia.org] standard can transmit a full, uncompressed HD signal over a serial connection. Why wasn't that used?
Certain people are aware that the quality could be better and others are working on it. For the time being, the quality was deemed to be suitably acceptable.
Any level of compression artifacts introduced at this level is unacceptable. We understand that HD video has to be compressed to fit into a sane amount of space, but up until now all cable formats have been lossless – this is a regression.
And why does your marketing literature say 1080p output when that is clearly not true?
Am I missing the point? (Score:3)
Really, I wonder if I'm missing the point of this outcry. I think putting a chip on the dongle to offload the decoding of video streams to HDMI to be a really elegant and scalable solution; possibly even scalable to the point of doing 4K video with a more powerful ARM chip.
The bandwidth of the Lightning connector itself is easily in the same range as USB 3 (10 Gbps), and the limiting factor is the hardware in the iPad/iPhone, which is limited to USB 2.0-spec speeds across that connector. There's no technical reason that future devices won't up that to USB 3 speeds, but the chipsets just aren't there yet. Once upped into that range, there's no reason that uncompressed 1080p video can't be pushed through that interface (approximately 3 Gbps at max throughput, FYI). Again, the limit isn't the Lightning connector but rather the chipsets in the current range of devices. I don't have all the specs off-hand, but it's quite likely that the Lightning connector is actually capable of faster speeds; the standards for that just don't exist yet.
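The "approximately 3 Gbps" figure is easy to check with back-of-the-envelope arithmetic:

```python
# Back-of-the-envelope check of the "~3 Gbps for uncompressed 1080p" figure.
width, height = 1920, 1080
bits_per_pixel = 24          # 8 bits each for R, G, B
fps = 60

raw_bps = width * height * bits_per_pixel * fps
print(f"Uncompressed 1080p60: {raw_bps / 1e9:.2f} Gbps")   # ~2.99 Gbps

# For comparison, USB 2.0 tops out at 480 Mbps
usb2_bps = 480e6
print(f"USB 2.0 shortfall: {raw_bps / usb2_bps:.1f}x")     # ~6.2x too slow
```

Which is why, at USB 2.0-class speeds, something has to compress the stream before it crosses the connector.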
Besides, this is where I think I'm missing the point: Why the hoopla? This is a consumer-grade device (iPad/iPhone) and we have some guy who's got his ass chapped by the fact that it can't output uncompressed 1080p video through its current connector? Uhm... OK. The 30-pin adapter got around this by having discrete video output on its own pins; the Lightning connector is purely a data connection. Yes, this change to only being able to get compressed video out at USB 2 speeds does seem to be a bit of a step backward, but again this is a consumer-grade device and should be treated as such. If you're using it for playback of video that must be 1080p in all its uncompressed and perfect glory then you're really missing the point and probably need... oh, I don't know... a laptop with HDMI out? Or Thunderbolt if you're really an Apple fan?
I'm not an Apple apologist; I am typing this on an Alienware laptop running Ubuntu and my phone is a Galaxy Nexus. Yes, I have a MacBook Pro as well and it's a great laptop, but I am in no way a fanboy. I just realize that this is a pretty elegant solution that's really scalable and interesting... but you have to remember this is for a consumer-grade device, not a professional one. This is for displaying your holiday pictures on a big screen, or playing back your holiday videos to the great chagrin of your friends; this isn't for reviewing takes between shots of the latest movie blockbuster. Apple sells better hardware for that. So do many other manufacturers.
All that being said; there's no technical reason that future generations of the iPad/iPhone won't be able to output 1080p uncompressed through Lightning... the limit is not the new connector but rather what the device itself can output. This dongle design is actually a really good idea... faster ARM CPU in there and you've got massive scalability.
Re:Airplane mode? (Score:4, Informative)
Bizarrely, MHL (which also has 8 or 11 pins depending on whether your device comes from Samsung; the connector is not part of the standard) can do 1080p HDMI while having much cheaper (and probably much simpler) cables to boot. It appears that either Lightning is noticeably inferior to MHL or Apple just managed to badly screw up the adapter.
Re:Stop the presses! (Score:5, Informative)
Wow, not only did you not read the article, you didn't even look at the pictures, did you?
Stop the presses! They are scaling 1024x768 content to 1600x900,
The cable is advertised as doing "up to 1080". It does not.
and there are MPEG artifacts happening as a result?!?! The deuce you say! There's never artifacts when you scale things! Never, I say!
Did you look at the picture? Those are not scaling artifacts: there is noise around edges. Those look like artifacts from MPEG or a similar compression algorithm. If it was just scaling, it would introduce aliasing patterns, which is not what they are talking about.
Next thing I know, you'll be claiming that Apple didn't replace all the already transcoded content on the Inktomi CDN with new, higher resolution content over night!
What does that have to do with this discussion?
It's almost already too scandalous that they used a CPU and software to avoid having to design and spin silicon for a Lightning-to-HDMI converter ASIC.
In fact, it looks like they did create an ARM-based ASIC, which on the face of it is bizarre to find in something sold as "an adapter cable". It's obviously doing something much more than or quite different from your standard adapter cable.
I can only echo some of the sentiments expressed in the bad ratings they received in several reviews from owners of Samsung Televisions which improperly negotiate EDID information by failing to negotiate on input sources which are not selected at the time the device comes online. One would almost think this might be an issue for Linux systems when trying to use HDMI to output to Samsung equipment, or that Dish Network DVRs might have similar problems (with the fix being to plug the device into the input channel which is selected by default when the television is powered on).
EDID? Linux? What? The article doesn't mention those topics at all. It's talking about an ARM-based chip that was unexpectedly found in a new model of a supposed "adapter cable" from Apple that is providing results substantially inferior to what was available on older models of Apple's similar products. As a result, if you use this cable to attach your iWhatever to a TV, you get laggy, downsampled, artifact-laden video, where Apple's previous products and products from their competitors deliver sharp, un-transcoded 1080p video.
Re: (Score:3)
Wow, not only did you not read the article, you didn't even look at the pictures, did you?
I looked at the pictures. I saw artifacts from scaling 1024x768 4:3 aspect ratio content to 1600x900 16:9 aspect ratio content from source material encoded at 1024x768, with intentional watermarking to identify the iTunes account that the data was pulled down from. Do you often watch your television with a microscope?
The cable is advertised as doing "up to 1080". It does not.
I'm not sure I buy the information in the blog post. Specifically, the thing that drives the EDID negotiation is the display device; it states what resolutions it supports, and the device dri
Re: (Score:3)
Stop the presses! They are scaling 1024x768 content to 1600x900, and there are MPEG artifacts happening as a result?!?! The deuce you say! There's never artifacts when you scale things! Never, I say!
Ah, I see. You must be from marketing. How clever of you to put a positive spin on the story that they now must scale 1080p down to 1600x900, a notable step backwards from their previous design, which could do native 1080p uncompressed.
Maybe it's not marketing, maybe it's just reality distortion.