Just amazing! Hardware startups are beyond sexy. There is something so compelling about tactile devices... no amount of sophisticated software engineering can come close to that feeling. Wipro's chairman was once asked why Wipro still made soaps and shampoos even though it was a billion-dollar software services shop. He just picked up a bar of soap and said: it's just very physical. Software, you can't touch, you can't see...
In the early 90s, there was this thing called a Centronics port. You looked behind an old dot matrix printer and found this large, ugly parallel port with 25 pins ( http://bit.ly/YT4xRh ). I asked my professor, "How does the printer print?" He took my question at face value and hooked the port up to a CRT... soon I realized pins 2-9 would output any goddamn digital signal I wanted. Literally anything. I wrote a QuickBASIC program to output a square wave, a sine wave, a sawtooth... and those waves started showing up on the CRT. So it got me thinking: if I fed that 8-bit output to an 8-bit TI DAC, I could make an analog function generator! ( http://bit.ly/WmFy8o ) My final version needed a 741 op-amp and a 555 timer, but it actually worked over a wide range of frequencies, and it cost about one hundredth the price of a real Kenwood function generator! Of course, back in those days there was no YC or startup culture, and that was the end of that device... :( But just thinking about that one hardware device gives me more pleasure than all the software I wrote after graduating.
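For anyone who wants to relive that trick, the core is just a loop writing 8-bit samples to the parallel port's data register and letting an external DAC turn them into voltage. A rough sketch in Python, with the actual port write left as a placeholder (the original would have been QuickBASIC's OUT &H378 on DOS; the write_data_pins helper here is purely hypothetical):

    import math
    import time

    LPT1_DATA = 0x378  # conventional data-register address for LPT1 on a PC

    def write_data_pins(value):
        # Hypothetical stand-in: on DOS this was OUT &H378, value;
        # on Linux you'd use ioperm()/outb() or a parallel-port library.
        raise NotImplementedError("replace with a real port write for your platform")

    def sine_table(samples=64):
        # One period of a sine wave scaled into the 0..255 range of an 8-bit DAC.
        return [int(127.5 + 127.5 * math.sin(2 * math.pi * i / samples))
                for i in range(samples)]

    def play(table, sample_delay=0.0):
        # Feeding these bytes to the DAC produces the analog waveform;
        # the per-sample delay (or lack of it) sets the output frequency.
        while True:
            for v in table:
                write_data_pins(v)
                if sample_delay:
                    time.sleep(sample_delay)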
Yes, hardware startups are sexy, but most of them are also very software-intensive. Leap Motion needs a lot of processing power, even for a simple visualizer app, and I bet Myo does too.
Other examples: guidance for things like the Sphero or the Parrot AR Drone is very complex. Usually there's a lot of maths involved, which has to be translated into software.
Electrical input from the body is extremely noisy. I wouldn't be surprised if they were using machine learning to train their algorithm to recognize which electrical patterns correspond to which muscles being activated.
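If they are doing that, the pipeline is probably something like the toy sketch below: window the raw EMG stream, pull out a few standard features, and train an off-the-shelf classifier to map them to gesture labels. Everything here (the features, the labels, the choice of classifier) is my guess for illustration, not anything Thalmic has published:

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    def emg_features(window):
        # window: (n_samples, n_electrodes) array of raw EMG for one time slice.
        mav = np.mean(np.abs(window), axis=0)                        # mean absolute value
        rms = np.sqrt(np.mean(window ** 2, axis=0))                  # root mean square
        zc = np.sum(np.diff(np.sign(window), axis=0) != 0, axis=0)   # zero crossings
        return np.concatenate([mav, rms, zc])

    def train_gesture_classifier(windows, labels):
        # windows: EMG windows recorded during a calibration pass,
        # labels: the gesture performed in each (e.g. "fist", "finger_tap").
        X = np.stack([emg_features(w) for w in windows])
        clf = RandomForestClassifier(n_estimators=100)
        clf.fit(X, labels)
        return clf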
Your last sentence paints a pretty clear picture: get back into building hardware. You said it yourself, we do now have YC and the startup culture.
> The first generation can recognise around 20 gestures, some as subtle as the tap of a finger
I wonder if it's able to distinguish between fingers. If so, it would be possible to create some sort of chorded keyboard, without the keyboard of course. Combine with Project Glass and you have awesome sauce.
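A chorded mapping is straightforward once you have per-finger events: treat each finger as a bit and look the resulting chord up in a table. The finger events and the chord table below are invented for illustration; nothing here reflects any actual Myo API:

    FINGERS = {"thumb": 1, "index": 2, "middle": 4, "ring": 8, "pinky": 16}

    # Which simultaneous finger taps produce which character (illustrative only).
    CHORDS = {
        2: "e",          # index alone
        4: "t",          # middle alone
        2 | 4: "a",      # index + middle
        1 | 2: "o",      # thumb + index
        1 | 2 | 4: " ",  # thumb + index + middle -> space
    }

    def chord_to_char(active_fingers):
        # Map the set of fingers detected down at the same time to a character.
        code = sum(FINGERS[f] for f in active_fingers)
        return CHORDS.get(code)

    # e.g. chord_to_char({"thumb", "index"}) -> "o"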
"The MYO senses gestures and movements in two ways: 1) muscle activity, and 2) motion sensing. When sensing the muscle movements of the user, the device can detect changes in hand gesture right down to each individual finger."
I've worked extensively with sensored gloves in my job, both designing the gloves and developing software to do cool things with them. By far the most difficult thing is making a compelling application that effectively utilizes the capabilities of the hardware. This is a very cool piece of hardware, but its success is going to be defined by whether or not they are able to make it better than existing input devices, which tend to be quite good already.
I'm terribly excited about this interface. I've been hoping that this is the right path towards a Rainbows End (http://en.wikipedia.org/wiki/Rainbows_End) kind of interface.
Having your arms hanging loosely, or resting on an elbow support, while gesturing seems to be the right way to take advantage of human fine motor control when manipulating a digital interface. It is a good workaround for gorilla arm.
I suspect that the 20-year-out interface Gabe Newell is expecting will use this kind of input.
This is cool, but I couldn't believe their video included the skiing example. As an avid skier, I can't think of anything more life-threateningly dangerous than messing with menus to share a video while you're sliding down a mountain.
I get that it has nothing to do with the armband itself; it just seems like a very poor choice. I cringed.
The skier used it at the top of the run and once at the bottom, to Share once the jump was complete.
Also, you should note that if it works as well as claimed, it would be every bit as fast as any other gesture the human body makes. So it's not like you're fiddling with a touchscreen or mousepad while moving down the mountain...
I haven't been this excited about a startup's product in a very long time. I'm constantly tapping out rhythms with my fingers - often quite pleasant and complex ones - and for whatever reason I've never been able to do the same on any kind of MIDI device - a keyboard or a set of drum pads or whatever. I would love to give this a try as a MIDI controller!
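Hooking tap events up to MIDI would be a small amount of glue code. A sketch using the mido library, assuming some hypothetical tap_events() stream from the armband's SDK (the finger-to-drum-note mapping is just an example, and a backend such as python-rtmidi has to be installed for mido to open a port):

    import mido

    # General MIDI percussion notes: kick, snare, closed hat, open hat, crash.
    FINGER_TO_NOTE = {"thumb": 36, "index": 38, "middle": 42, "ring": 46, "pinky": 49}

    def tap_events():
        # Hypothetical stand-in: yield (finger_name, strength in 0..1) per tap.
        raise NotImplementedError("wire this to whatever the device SDK provides")

    def run():
        out = mido.open_output()  # default MIDI output port
        for finger, strength in tap_events():
            note = FINGER_TO_NOTE[finger]
            velocity = max(1, min(127, int(strength * 127)))
            out.send(mido.Message("note_on", note=note, velocity=velocity, channel=9))
            out.send(mido.Message("note_off", note=note, channel=9))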
Ok, this is what I think we will see a lot of. No talking, no vision, no touching, just some simple gestures.
Dave Rosenthal (the former MIT/Sun/nVidia one, not the archivist) and I talked a bit about gesture control when the first MEMS accelerometers came out. They allowed for a wide range of controls, but they drifted horribly, so it wasn't possible to do a "hold for fast forward" kind of motion. You had to have a move to start and a move to end.
To some extent this is the same problem as the Leap, which can know where your fingers are, but you can't hold a gesture and move your hand (afaict; I've not seen the Leap SDK in action yet).
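In code, that "move to start, move to end" pattern is just a tiny state machine: discrete gestures bracket the action, so sensor drift between them doesn't matter because no pose has to be held. The gesture names here are invented for illustration:

    class HoldlessControl:
        def __init__(self):
            self.fast_forward = False

        def on_gesture(self, name):
            # Discrete start/end events replace "hold this pose",
            # which accelerometer drift makes unreliable.
            if name == "flick_forward":
                self.fast_forward = True    # begin fast-forwarding
            elif name == "flick_back":
                self.fast_forward = False   # stop
            return self.fast_forward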
First thing someone would do with this: integrate it with Google Glass. While nodding your head or tapping a touchpad on the side by your temple might be awkward, discreetly swiping up and down with your hand would be more natural and less obvious.
Just a thought in this field of motion controls - it could be cool to use gloves/fingers/points of contact tapping as part of the motion/gesture control aspect, i.e.: tapping my thumb and forefinger together is one gesture, thumb and middle finger another, thumb and ring finger another, etc. So on top of using motion and speed, you also detect commands when points connect together.
Depending on the sensitivity of this, the potential is endless. There are many multi-finger gestures that would work very well in three-dimensional space that have not yet been exploited with current technology. I'm excited to see what will come of this.
Just preordered one. If it's as good as it looks in the video, the possibilities for building amazing things are endless. As stated in the NewScientist article, combine this with Glass and some home control equipment... oh, the future is now! :)
Good observation! I always like these minimalist signature loading animations.
I like the logo too. I see it as some kind of trans-binary, trans-bi-state, non-analog frequency peak-detector (0,1) state, not the ol' tired, familiar 0/1 on-off switch. Something new!
This one indeed looks better, but the thing that caught my attention was their vision of enabling humans to do more with computers, of enhancing their abilities (finally, no more of the BS that touch is the way forward...). I feel this vision is going to be their killer app (eventually), and one of the best ones to have come out of YC recently.
Given that the technology in that film was designed by actual interface researchers, based on their research, it's no surprise that it continues to resonate so well. Also, sci-fi does have a tendency to inspire, or at least be inspired by, the leading edge of technology.
Said researchers now have a company, Oblong Industries, that's basically making the Minority Report interface a reality: http://oblong.com/
Wow. This is the future of the human/machine interface (until we get direct brain/machine interfaces (BMIs) working with high fidelity and little to no invasiveness). I really hope the actual product works as well and as smoothly as in those demos.
Can you ensure that common activities such as typing, writing, stretching etc. won't be registered as intended triggers for actions? Anyway, amazing product! I can totally see myself using and developing with one!
I wonder if they've looked into adding an NFC chip into MYO. It may seem crazy for a gesture control armband but it could add potential functionality to the armband and also improve communication among the masses.
Honest question: is this area a patent minefield? Or are the Myo guys busily sewing it all up? (I'm glancing nervously at the maybe-more-tricky-than-anticipated 3D printing field, for instance.)
'k people, I wanna see a coupla-three dev teams getting these patent specs sewn up before Myo flips behind Apple's iWatch wristband wall.
1. Permute & Enumerate Discrete Associative Layer (PEDAL) Group. Enumerate all normative ergo-kinesiologic mappings of discrete haptic events onto an associative array we can MTF onto task expectation frequencies.
2. Dynamic Adaptive Nascent Cognitive Evolution (DANCE) Group. Get past the AA (Acknowledgment Annunciator); transparently and simultaneously entrain both user and device handshake onto evolving MTF, surfacing situ-specific command completion.
3. Comparative Haptics Integral Nuance Advancement (CHINA) Team. Collide Leap, Myo, Kinect, Oculi; onto mobile, glass, watch.
That's it. Git'er done like yesterday, fellas. Now!
Does it suffer from "gorilla arm" like the others? Is it necessary to keep one's arm aloft for hours to get in a day's work? It certainly looks exciting for momentary input.