Despite everything going on in the frame, video has always been a passive experience: you can watch, but you can't reach in and mess with the objects being filmed. Until now. An MIT researcher has pioneered a new technology that lets you "touch" recorded objects, which are modeled so you can fiddle with them as you would in the real world.
So how can the system predict which way a recorded object will move when tugged? The technique, called Interactive Dynamic Video (IDV), needs less than a minute of footage to work out how an object can move. It does so by analyzing how the object changes when deliberately jostled: in the example video below, a researcher bangs on a table holding a human figure, letting the system see how the figure vibrates at different frequencies. From those vibration modes, it extrapolates how the object should behave when a viewer grabs it with a cursor and jostles it.
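To make that idea concrete, here is a rough Python sketch of the kind of frequency analysis described above. This is not Davis's actual pipeline: it assumes per-point displacements have already been tracked from the jostled footage, picks out the dominant shared vibration frequencies with an FFT, and fakes a response to a cursor "poke" as a sum of damped sinusoids. The function names and the simple damping model are illustrative inventions.

```python
import numpy as np

def dominant_modes(displacements, fps, n_modes=3):
    """Pick the strongest shared vibration frequencies from tracked motion.

    displacements: (n_points, n_frames) array of per-point displacement
    over time, extracted from footage of the object being jostled.
    fps: frame rate of the source video.
    """
    # FFT each point's displacement over time, then average the spectra:
    # frequencies shared by many points (structural modes) stand out.
    spectra = np.abs(np.fft.rfft(displacements, axis=1))
    mean_spectrum = spectra.mean(axis=0)
    freqs = np.fft.rfftfreq(displacements.shape[1], d=1.0 / fps)

    # Skip the DC bin, then take the n_modes strongest peaks.
    peaks = np.argsort(mean_spectrum[1:])[-n_modes:][::-1] + 1
    return freqs[peaks]

def jostle_response(mode_freqs, duration=2.0, fps=30, damping=0.05):
    """Toy response to a unit 'cursor poke' at t=0: each detected mode
    rings like a damped harmonic oscillator, and the displacements sum."""
    t = np.arange(0, duration, 1.0 / fps)
    return sum(np.exp(-damping * 2 * np.pi * f * t) * np.sin(2 * np.pi * f * t)
               for f in mode_freqs)
```

In spirit, that's the trick: once you know which frequencies an object naturally vibrates at, you can synthesize a plausible wobble for pokes the camera never saw.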
Obviously this has many applications in entertainment and education, but the researchers wanted a more timely example, so they plugged the system into an augmented-reality Pokémon. With IDV, the environment reacts to the pocket monster's movement, making it seem like the wee beast is really bending the leaves and grass it bounces around in.
The simulation isn't perfect, admits Abe Davis, who recently completed his doctorate at MIT, in the video, but more footage would improve the precision of IDV's motion model. With high enough accuracy, the technique could be used to evaluate the structural integrity of buildings. It could also find work in film special effects, letting computer-generated characters convincingly interact with real objects on screen.