Cheap 3D Motion Sensing System Developed At MIT
Al writes "Researchers at the MIT Media Lab have created a cheaper way to track physical motion that could prove useful for movie special effects. Normally an actor must wear special light-reflecting markers, with numerous high-speed cameras placed around a specially lit set. The new system, called Second Skin, instead relies on tiny photosensors embedded in clothes that record movement by picking up patterns of infrared light emitted by inexpensive projectors that can be mounted in ceilings or even outdoors. The whole system costs less than $1,000 to build, and the researchers have developed a version that vibrates to guide a person's arm movements. Watch a video of Second Skin in action."
Tracking fidelity (Score:4, Interesting)
The tracking fidelity in the video seems low. For movie work you need very smooth input; otherwise you end up spending a lot of money smoothing out the positional data, which has the side effect of making it look more artificial and robot-like.
What I do like is the use of projected patterns to track individual dots, that's pretty clever. But it seems like this won't be the final solution. Ultimately we're going to need to perfect a micro-GPS system, and that has many more applications than just use as movement-capture for movie production.
Re:Tracking fidelity (Score:4, Informative)
The video on the SecondSkin web site says it captures 5000 frames per second. I think the slowness you perceived in the feedback video was due to the feedback software, not the capture system.
Re: (Score:1)
The video on the SecondSkin web site says it captures 5000 frames per second. I think the slowness you perceived in the feedback video was due to the feedback software, not the capture system.
How can it capture at 5000 fps when the projectors that give it a point of reference run at only 1000 fps? Besides, it's jerky.
Re: (Score:2)
Mmm, frames per second and fidelity are two different things, much as performing 5,000 calculations per second says nothing about whether those are floating-point or integer calculations. It's like you're telling me it's a video camera that shoots 5,000 FPS and it's shiny, and I'm looking at the features tag that says it only captures 1-megapixel resolution. We want, we need, more resolution, especially for mo-cap in feature films, where the slightest jitter is extremely noticeable.
Re: (Score:3, Insightful)
Is video too complex to allow the sort of math we do on audio? In the audio realm, most ADCs are natively 1-bit converters with a ridiculously high sampling rate (in the MHz range). That turns out to be mathematically equivalent to, say, 24-bit audio at 192 kHz.
But audio's a single waveform, and video's a collection of pixels, so I guess it's all different.
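The 1-bit/oversampling trade-off described above can be sketched in a few lines. This is a toy first-order sigma-delta modulator, not a real ADC design; the 256x oversampling factor and the plain block average (standing in for a proper decimation filter) are simplifications for illustration:

```python
# Toy first-order sigma-delta modulator: a slowly varying input becomes
# a dense 1-bit stream whose local average tracks the input value.

def sigma_delta_1bit(samples):
    """Input values in [-1, 1] -> stream of +/-1 bits."""
    out = []
    integrator = 0.0
    for x in samples:
        bit = 1.0 if integrator >= 0.0 else -1.0   # 1-bit quantizer
        integrator += x - bit                      # feed the error back
        out.append(bit)
    return out

def decimate(bits, factor):
    """Crude decimation filter: average blocks of the 1-bit stream."""
    return [sum(bits[i:i + factor]) / factor
            for i in range(0, len(bits) - factor + 1, factor)]

# Oversample a constant 0.3 input 256x, then average back down;
# the recovered value lands very close to 0.3.
bits = sigma_delta_1bit([0.3] * 256)
recovered = decimate(bits, 256)
```

The averaging step is where the resolution comes from: the quantization error stays bounded in the integrator, so a block of N one-bit samples recovers the input to within roughly 1/N.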
Re: (Score:2)
Re: (Score:2)
Oh... duh. This is why I'm no good at math. Thanks for the explanation.
Re: (Score:2)
Re: (Score:1)
Hey, watching movies is *fun*.
There's nothing like a good film to (temporarily) take your mind off reality.
Re: (Score:2)
Hey, watching movies is *fun*.
There's nothing like a good film to (temporarily) take your mind off reality.
Eh, but what is reality?
More appropriate, I think, rather than saying we escape reality, would be to say that living vicariously through movies is entertaining. We generate work through our own desire for entertainment and luxury; otherwise most of us would be out of work, because all we really need to survive is food and shelter, and we're so efficient at providing those that only a few people are needed to sustain the rest. Entertainment and enjoyment bring meaning to life beyond mere subsistence.
What I mean
Re: (Score:2)
because goodness knows in these troubling times, our society needs to concentrate our technological progress into the betterment of movie special effects, and a better cost structure for producers of action blockbusters.
Yeah, you wouldn't want people spending tons of money on frivolous things during an economic downturn.
Re: (Score:2)
Amen! We should all be suffering! What is wrong with all these people with their fun games and entertainment, don't they know we're in a recession?
Re: (Score:1)
Speaking of movie special effects... anyone else see a striking similarity to the ractives in Neal Stephenson's book The Diamond Age? The next step is sensors permanently embedded under the skin...
Re: (Score:1)
Hence "from the let-the-ractives-begin dept" d'oh
Second Skin? Unfortunate name... (Score:3)
When I saw the name of this, I immediately thought of Second Life.
Second Skin takes over Second Life!
Oh, the humanity! [or lack of...]
I bet the pr0n industry could have fun with this...
Re: (Score:1)
Huh. The first thing I thought of was Crown Skin Less Skin condoms. Which are already very popular in the porn industry.
-Peter
Re: (Score:1)
Already been done [secondskinlabs.com].
Combine with VR Cave! (Score:2)
Combine Second Life and Second Skin with virtual reality "cave" technology [digitalcon...oducer.com] and you have a low rent holodeck. Use it to interpret gestures like the Wii does, and yes, you have a revolution in cybersex and interactive pr0n.
I say it's a buy! Someone is going to make many millions on this. (Especially if they invent a Bluetooth API for optional teledildonics [wired.com].)
WiiHD? (Score:2)
Re:WeeeeeeHD? (Score:2)
...relies on tiny photosensors embedded in clothes...
...and the researchers have developed a version that vibrates...
Someone will work the system into porn and THEN we'll have a video game that is REALLY addictive!
No more "balls in your face" jokes (Score:4, Funny)
Re: (Score:3, Funny)
Re: (Score:1)
Re: (Score:2)
It'll sure have a huge impact on movies being made by five friends with whatever effects they and their buddies can put together! Hack together your own mo-cap studio for a couple thousand, and the amount of stuff you can do goes way up.
Also:
Re: (Score:2, Offtopic)
Re: (Score:2)
How exactly are radio/wifi/wimax/bluetooth at all relevant in relation to a motion capture camera?
They're not. It sounds like he's confusing the use of IR with an IrDA port on a laptop. (BTW: his question wasn't off-topic. He asked an interesting question.)
IR is used to illuminate the balls on the mocap suit so that the cameras in the volume see little else but bright white specks to track. They use IR in particular because they can make those balls really bright for the volume while still retaining normal lighting on the stage. Besides not requiring actors to act in the dark, they also do this so they can
Re: (Score:2)
I honestly don't know why IR is chosen over radio.
It bogs down the actors with either cables or batteries. Also, radio requires a smaller volume and such a high frequency that it ends up becoming line-of-sight anyway.
In short: it's inferior to IR and optical capture in real-world scenarios.
Re: (Score:2)
Mobile Phones and GPS would like a word.
Both have been working for years without many problems.
You don't strap 40 cell phones to a human and expect them, at 60 fps, to provide accurate position and rotation data down to the millimeter. That's about like saying "we landed a man on the moon several decades ago, there's no reason we can't get people to Mars."
It's a more difficult problem than it looks.
Re: (Score:1)
I envision establishing a "box" of eight transmitters (that many isn't technically necessary, but might provide more accuracy and error-correction; initial though
Re: (Score:2)
What he might have meant is using a technology where instead of looking at where sensors are from the outside, use sensors that transmit where they are via radio or something. There is some sense in that, provided the technology exists. I saw demos years ago of a suit sort of like that. I don't know if it transmitted translational data (as opposed to just rotational data...), but even if it did, there was a nice big cable coming from the actor to a computer somewhere.
Just get a spandex suit with tiny RFID tags embedded in it and build an array of receivers in the studio in fixed positions and do the equivalent of GPS triangulation on the smaller scale. Record the data and do the math later to whatever accuracy you need (you're not locating yourself on Earth so civilian GPS hardware limitations don't apply). Meanwhile, your actor is able to wear normal costumes on top of the spandex suit and you'll be able to augment his performance on a more practical set. Maybe even sh
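The "GPS triangulation on a smaller scale" idea reduces, in its simplest 2D form, to trilateration from receivers at known fixed positions. A minimal sketch; the studio dimensions, receiver layout, and tag position below are invented for illustration and say nothing about whether the RF side is practical:

```python
import math

def trilaterate_2d(receivers, distances):
    """Locate a tag from its distances to three fixed receivers (2D).

    Subtracting the first circle equation from the other two turns the
    quadratic system into a linear 2x2 one, solved via Cramer's rule.
    """
    (x1, y1), (x2, y2), (x3, y3) = receivers
    d1, d2, d3 = distances
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = (x2**2 - x1**2) + (y2**2 - y1**2) + d1**2 - d2**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = (x3**2 - x1**2) + (y3**2 - y1**2) + d1**2 - d3**2
    det = a1 * b2 - a2 * b1   # zero if the receivers are collinear
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Receivers at three corners of a hypothetical 10x10 m studio,
# tag actually at (3, 4); distances simulated from ground truth:
rx = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
tag = (3.0, 4.0)
dists = [math.dist(tag, r) for r in rx]
estimate = trilaterate_2d(rx, dists)   # ~ (3.0, 4.0)
```

Real ranging data would be noisy, so a practical system would use more than three receivers and a least-squares fit rather than an exact solve.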
Re: (Score:2)
Make it work in a 70' square volume with 18 actors moving in it and patent it.
Re: (Score:2)
I gave up on the idea of patenting it. It's too obvious. All the barriers to doing it are easily discoverable and the solutions too few and thus obvious to survive a patent challenge (such as coming up with a powerful enough signal to drive the RFID tags without violating FCC regs and not interfering with the reception of their signals, getting them all to uniquely identify both by identity and elapsed time from their driving signal (containing its own clock signal), optimum placement of receivers are a mat
Re: (Score:2)
What I'm trying to say is 'easier said than done'. The IR technology has gone quite a ways. I think they'd all love to have something that's just as capable without occlusion problems or expensive solutions. Really, though, that many actors in that big of space with a minimum of 40 markers each, that's a tall order no matter which technology you use.
Don't get too excited yet (Score:1)
We know what they were thinking! (Score:1)
One word: autopilot.
(Ironically, my captcha was "females")
The problem is... (Score:2)
That since most of the cost resides in doing something useful with the data (actually producing the images), the time and talent of the people that are _in_ the suits, etc., the producers really don't give a frak whether their motion capture system costs $1,000, $15,000, or even $100,000. What they want is something that is proven to work, that technicians are familiar with, and that you can readily rent by the hour along with the facility it's located in. So thank you, Media Lab, for another useless gadget.
Re:The problem is...no problem after all. (Score:2)
Parent's attention is fixed on the existing moviemaking structure and is not directed to alternative distribution and creation channels. Those alternative channels are the wave of the future. The cheaper production gets, the more opportunity we'll all have for a greater array of diverse movies.
Someday a truly independent movie is going to hit it big via reasonably independent internet distribution. That will change everything. Technology like this only makes that day closer to reality!
I say hurrah!
Re: (Score:3, Informative)
Re:The problem is... (Score:4, Insightful)
There are many small and medium-sized game development houses who would love an inexpensive motion capture system in order to capture data for things like in-game cut-scenes. And to them, yes, it makes a pretty big difference whether a system costs $1,000 or $100,000. Having to rent a studio by the hour is also pretty damned expensive.
Besides which, it seems foolish to offhandedly dismiss new technology such as this before it's even had a chance to develop into a useful product.
Re: (Score:2)
Perhaps, but you're thinking small time, here. If the price of a good-enough Mo-Cap system got down to $1,000, do you know what that means??? That means that there would certainly be some hobbyists taking this home and experimenting with it. When that happens lots of fun things can result.
Like a WiiMote! (Score:5, Interesting)
What's interesting to me is, this is almost exactly how the WiiMote works so cheaply!
A lot of people assume that the Wii's sensor bar actually senses, and that it can tell where the WiiMote is. But that ain't so. The sensor bar is just a pair of IR emitters. The front of the WiiMote is an IR camera. The thing you hold in your hand is looking at the external IR sources and using those to try and figure out where it is, and then telling that to the base system, almost exactly as is described in this article.
It's like someone said "hey, let's do motion capture by gluing WiiMotes all over a person's body!".
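The "camera in your hand looking at fixed IR emitters" scheme can be illustrated with a pinhole-camera sketch. The field of view, resolution, and sensor-bar emitter spacing below are rough ballpark figures commonly cited for the hardware, not specs from this article:

```python
import math

# Assumed numbers (not from the article): ~45-degree horizontal FOV at
# 1024x768, and the sensor bar's two IR clusters ~0.20 m apart.
CAM_WIDTH_PX = 1024
FOV_RAD = math.radians(45)
BAR_WIDTH_M = 0.20
FOCAL_PX = (CAM_WIDTH_PX / 2) / math.tan(FOV_RAD / 2)  # pinhole focal length

def estimate_pose(blob_a, blob_b):
    """Distance to the bar and horizontal angle, from two IR blobs.

    blob_a/blob_b are (x, y) pixel positions of the two emitters as seen
    by the remote's camera. Pinhole model: apparent separation shrinks
    in proportion to distance.
    """
    sep_px = math.dist(blob_a, blob_b)
    distance_m = FOCAL_PX * BAR_WIDTH_M / sep_px
    mid_x = (blob_a[0] + blob_b[0]) / 2
    yaw_rad = math.atan2(mid_x - CAM_WIDTH_PX / 2, FOCAL_PX)
    return distance_m, yaw_rad

# Two blobs centered in the image, 82 px apart -> roughly 3 m away,
# pointing straight at the bar:
d, yaw = estimate_pose((471.0, 384.0), (553.0, 384.0))
```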
Re: (Score:1)
A similar technique has been used to calibrate the image of a projector to a surface. Here is a video: http://blog.makezine.com/archive/2008/04/automatic_projector_calib.html [makezine.com]
Re: (Score:1)
Wiimote: Motion vs. Position (Score:2)
the Wiimote definitely [senses]. [...] [Parent also implies that all Wii motion-sensing is done with IR, and that isn't the case. The Wiimote has an accelerometer that can detect movement on 3 axes.]
Keeping up with "your post doesn't contradict this", I want to add:
The accelerometers sense differential data (motion), whereas the IR camera senses static data (direction towards IR light).
If you assume that there are only two infrared sources out in the world (in either end of the sensor bar) and they don't move, you can use your camera reading to infer your angle in the horizontal plane as long as you can see the infrared sources. Using that, plus the strength of gravity at different points on the wiimo
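The differential-vs-absolute distinction above is exactly what a complementary filter exploits: integrate the fast differential data, and let the slower absolute reading pull out the accumulated drift. A toy 1D sketch; the blend weight, sample rate, and bias value are made up for illustration, not taken from any real Wiimote firmware:

```python
# Differential data (a rate sensor) drifts when integrated; an absolute
# reading (the IR camera's angle) is drift-free but may drop out.
# A complementary filter blends the two.

def complementary_filter(rates, absolute, dt=0.01, alpha=0.98):
    """Fuse angular-rate samples with absolute angle readings (or None)."""
    angle = absolute[0] if absolute[0] is not None else 0.0
    history = []
    for rate, abs_angle in zip(rates, absolute):
        angle += rate * dt                  # integrate differential data
        if abs_angle is not None:           # absolute fix available
            angle = alpha * angle + (1 - alpha) * abs_angle
        history.append(angle)
    return history

# A rate sensor with a constant 0.5 rad/s bias, while the absolute
# reference keeps reporting 0 rad. Pure integration would drift to
# 5 rad over 10 s; the filter pins the error near a small constant.
rates = [0.5] * 1000
absolute = [0.0] * 1000
angles = complementary_filter(rates, absolute)
```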
Missing the point? (Score:1)
System looks flawed (Score:2)
Obligatory (Score:1)
$1000 of blister-healing goodness! And at 5000 fps!
http://en.wikipedia.org/wiki/2nd_Skin [wikipedia.org]
Here's how it works (Score:2)
It relies on cycling a repeating pattern from every projector 500 times/sec. Every pixel in the pattern encodes a unique symbol by the colors & the changes in the colors over time. By sensing what symbol hits each sensor, you know what pixel from the projector is hitting the sensor & what position on the projector's XY plane the sensor is in. If you know the XY plane position from 2 projectors, you can triangulate the sensor's 3D position, but projectors with enough resolution & bandwidth to
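The decode-then-triangulate step described above can be sketched in 2D: once a sensor has decoded which projector column is hitting it, each projector contributes a bearing, and bearings from two projectors intersect at the sensor's position. The projector placement, resolution, and FOV here are invented numbers, not the paper's:

```python
import math

COLS = 1024               # assumed projector resolution (one axis)
FOV = math.radians(60)    # assumed projector fan angle

def column_to_bearing(col, aim_rad):
    """Map a decoded projector column to a world-frame ray angle."""
    offset = (col / (COLS - 1) - 0.5) * FOV   # angle within the fan
    return aim_rad + offset

def intersect(p1, ang1, p2, ang2):
    """Intersect rays p + t*(cos a, sin a); assumes non-parallel rays."""
    d1 = (math.cos(ang1), math.sin(ang1))
    d2 = (math.cos(ang2), math.sin(ang2))
    det = d1[0] * (-d2[1]) - d1[1] * (-d2[0])
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]
    t = (rx * (-d2[1]) - ry * (-d2[0])) / det
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])

# Projector A at the origin aiming 45 degrees, projector B 10 m away
# aiming 135 degrees; their rays meet at the sensor, (5, 5):
sensor = intersect((0.0, 0.0), math.radians(45),
                   (10.0, 0.0), math.radians(135))
```

The 3D case is the same idea with one more coordinate, using both projector axes (or a second decoded pattern) per projector.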
Johnny Lee's method? (Score:2)
I wonder if it was inspired by Johnny Lee's automatic projector calibration system [youtube.com] (from 2003) which uses a very similar method. (Yes, that's the same guy that does the Wiimote hacks)
Keeping the projectors focused (Score:2)
So how do they keep the projected patterns in focus as the actor moves towards & away from the projectors? What if you want to track a close actor & a distant actor simultaneously? Those projected patterns aren't going to be in focus & the sensors won't know where they are.
Gray code patterns (Score:1)
They seem to use Gray code [wikipedia.org] sequences (only one bit differs between two neighbouring codes). Johnny Chung Lee (the Wiimote Whiteboard guy) already demonstrated the use of structured light and optical fibers [johnnylee.net] in his thesis. He used it to rapidly locate projection surfaces.
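The single-bit-change property is easy to demonstrate: a sensor straddling the boundary between two projector columns decodes to one of the two correct neighbours instead of a wildly wrong index. A minimal sketch of Gray-code structured light; the 1024-column projector and 10-frame sequence are assumed for illustration:

```python
# Each projector column flashes its Gray-coded index one bit per frame;
# a sensor records the on/off sequence and decodes its column.

def to_gray(n):
    """Binary index -> Gray code."""
    return n ^ (n >> 1)

def from_gray(g):
    """Gray code -> binary index (XOR of all right shifts)."""
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

def pattern_frames(num_cols, num_bits):
    """Frame k holds the k-th Gray-code bit of every column: the
    temporal on/off pattern each column flashes over time."""
    return [[(to_gray(c) >> k) & 1 for c in range(num_cols)]
            for k in range(num_bits)]

# A sensor sitting under column 300 of a 1024-column projector sees,
# over 10 frames, the bits of gray(300); decoding recovers the column:
frames = pattern_frames(1024, 10)
observed = [frames[k][300] for k in range(10)]
code = sum(bit << k for k, bit in enumerate(observed))
column = from_gray(code)   # 300
```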
Using sensors and gyros... (Score:1)
Don't know about the real performance of the technology, but the idea in itself seems to enable some freedom (no need for interior studios, less expensive).