Monday 11 October 2010

MomoLondon: Demo Night

I’m catching up on blog posts slowly. This one’s the Mobile Monday London Demo Night from 13th September… Together with Ewan and Sergio (manfully representing FuturePlatforms), I was presenting my Over The Air hack from the weekend (see previous post) and was consequently suffering from a lack of sleep. Here’s a quick rundown of the other presentations:

psonar

http://www.psonar.com/

  • upload music to psonar
  • get at music on any HTML5-connected device by streaming or downloading

Roulette Cricket

http://www.roulettecricket.com/

  • great idea
  • impressive implementation by FuturePlatforms, as usual…
  • £2.50/month to play for prizes, or a revenue share from bets

HipSnip

http://hipsnip.com/

  • mobile notepad for the things you want
  • enhanced with smart searches & geolocation:
    • wine.com, cnet, imdb, yelp
  • mobile design a little too small for hands… need to follow Apple HIG on Android too :-)
  • user research: found that people take photos, or save a draft SMS but then forget
    • wanted to have a way to remind people

The logo is an aleph. Is that for infinity or for Hebrew? The letter carved into the head of the golem…?

Toshl

http://toshl.com/

  • personal expense record
  • quite secure
  • have apps for Android, iPhone & Maemo
  • sync to web
  • on the web, can analyse and export
  • currently 45,000 users, 1/3 active
  • iPhone downloads in 3 weeks almost surpassed Android’s 6 months
  • are in negotiation with banks for auto-download
  • currently have a freemium model, looking for investment from banks for rebranding
  • some great observations:
    • RIM apps generally look ugly
    • the front end and usability of mobile apps take 50% of development time
    • cross-platform development is nosebleedingly expensive!

Collinson Group

http://twitter.com/mtjmelody

  • Columbus Insurance is one of their brands
  • a mobile trip planner
    • but with a surprisingly comprehensive offline database of airports, flight plans, hotels and car hire
  • had a desktop version for quite a few years, just brought across to the Apple platform

Appmarks

Adrian Cuthbert

http://www.myappmarks.com

  • bookmarks that point to specific places within apps
  • want to add Android version too
  • if app not available, offers to download it from store
  • would like apps to supply relevant info

Milestone Planner

Benjamin Ellis

http://milestoneplanner.com/

  • need a plan, but things change very fast
  • wanted a plan on a page that everyone could update
  • webapp written with HTML5 and JavaScript
  • server keeps a full history of all changes
    • can view as a list in the webapp, as RSS, or export to Twitter
  • facilitating a discussion around the plan
  • automatically generate weekly reports in PDF
  • most planners are based on TODOs; this is based on outcomes and people instead, at a larger scale

Qootia

tomaz@qootia.com

  • big screens that people can interact with by making voice phone calls
  • good demo, with great timing from the presenter

Liquid Air Lab

http://twitter.com/spodtronic

  • 480 apps developed on the most popular smartphone platforms
  • 7.8 million users
  • 33,000 downloads per day
  • Android users are a little more techie
  • BlackBerry users especially in the US
  • Nokia users… are catching up fast… !?!?
  • Ovi Store has overtaken the iPhone’s share
    • (since they have a special category for Radio apps…)
  • iPhone’s share is declining compared to the others
  • started on Symbian and moved on to iPhone, etc
  • now have 7 platforms…

Sunday 10 October 2010

Over The Air Hack - a LEGO robot with an iPhone brain

As promised, here are the details of my Over The Air hack — The Eyes Have It — which won prizes for “Best Hardware Hack” and “Best Use of Other Features” in the overnight hack-a-thon competition. I also took it along to Mobile Monday London’s Demo Night on the following Monday.

The hack was a LEGO Mindstorms robot with an iPhone brain that followed faces in front of it and steered towards them. It performed well on stage, following me as I gestured towards it, just like a small pet.

The hack was composed of two parts: an iPhone app that detected faces in the video feed from the front-facing camera; and the LEGO robot that took instructions from the iPhone and steered accordingly.

Communication

Getting the two parts to communicate was one of the trickiest areas to get right, and caused extra headaches before each presentation. Both LEGO Mindstorms and the iPhone can communicate over Bluetooth, but Apple has restricted Bluetooth communication to companies that will pay the Apple “Made for iPhone” licence fee or that use particular hardware. Bluetooth communication is not available through the standard SDK and apparently needs some kind of “secret handshake” to work.

Unfortunately, Mindstorms came out a while before the iPhone and uses a different Bluetooth chip; and LEGO and Apple haven’t managed to do a deal to provide Mindstorms access from the iPhone (perhaps because LEGO has open-sourced much of their software?). LEGO has now released an Android app to showcase mobile phone integration, so let's hope Apple can work with them to get some iPhone apps too.

So no Bluetooth, and Apple won’t let you talk over the dock connector either… That left feeding information to the robot through one of its five senses — a touch-sensitive button, an ultrasound distance sensor, a microphone, a light sensor or the motors themselves (which can detect rotation).

A quick search for “LEGO Mindstorms iPhone” brought up iPhoneRobot, which used the light sensor to pick up different greys on the screen. This was a great start, but it wasn’t quite what I was looking for. For a start, it required the LeJOS firmware on the Mindstorms brick — this is a cut-down Java VM that replaces the built-in LEGO firmware. I’ve left my Mindstorms with the default LEGO firmware as I use it with my 7-year-old son. He’s not quite ready for Java, but can easily understand the LabVIEW-based visual programming that comes with the Mindstorms kit. Secondly, the robot itself wasn’t quite suitable — I wanted a robot that would recognise faces, so I needed the iPhone to be pointing upwards towards people’s heads rather than along the ground.

Motor skills — building the robot

Designing a new robot from scratch takes quite a while and I only had overnight. Unlike normal LEGO with bumps and holes, the Mindstorms kit uses the new-style LEGO Technic, which is mostly holes and connectors. Also, LEGO’s own sample models are pretty complex, as they are built to look like animals or people as well as to interact.

Fortunately, I found a great site with designs for quick Mindstorms models that are easier to hack to do what you want. Some of the ideas are amazing — especially a Segway that actually balances! I started with the 3-Motor Chassis and added the distance sensor on the front to prevent crashes. I also added the button sensor on the side to help with starting and stopping the program, as the iPhone was in the way of the program buttons on the Mindstorms block.

Here are some pictures of the final result, holding the iPhone pointing upwards. I’ll try to generate some build instructions later.

Computer vision — the iPhone app

Humans are exceptionally good at seeing faces. Our brains have been trained from birth to detect and analyse faces very quickly. We can tell which way people are looking from far away and even see faces in random patterns.

Computers have a harder time of it, although recent developments have massively improved what is possible. Companies such as Polar Rose have shown demonstrations of both face detection (finding out where any faces are in an image or video) and facial recognition (matching the detected faces against a database of known images) running in real-time on mobile phones. Unfortunately, their code is not available to an overnight hacker, though they’ve recently been bought by Apple so we may see interesting capabilities in future iPhones.

However, Intel launched an open source project back in 1999 called OpenCV (for Computer Vision), and not only is it still going strong, but it builds easily for the iPhone and there is a git repository with a ready-built version of the library. OpenCV is all about providing well-optimised functions for real-time computer vision, so that developers do not have to reinvent the wheel. It includes face detection algorithms, and a guy called Roy has posted some examples of how to get face detection working on an iPhone video feed.
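
To give a flavour of what that involves, here’s a minimal sketch of the Haar-cascade detection call using OpenCV’s C API (the cascade file name and tuning parameters are illustrative placeholders rather than the exact values in my code):

    // FaceDetector.mm (Objective-C++) -- minimal Haar-cascade sketch.
    // The header path depends on how the OpenCV library was built and added.
    #import <Foundation/Foundation.h>
    #import <opencv/cv.h>

    static CvHaarClassifierCascade *cascade = NULL;
    static CvMemStorage *storage = NULL;

    // Load the cascade once at startup; the XML file ships with OpenCV
    // and needs to be added to the app bundle.
    void setUpFaceDetector(void) {
        NSString *path = [[NSBundle mainBundle] pathForResource:@"haarcascade_frontalface_default"
                                                         ofType:@"xml"];
        cascade = (CvHaarClassifierCascade *)cvLoad([path UTF8String], NULL, NULL, NULL);
        storage = cvCreateMemStorage(0);
    }

    // Returns the first detected face in a greyscale frame, or a zero rect if none.
    CvRect detectFace(IplImage *grey) {
        cvClearMemStorage(storage);
        CvSeq *faces = cvHaarDetectObjects(grey, cascade, storage,
                                           1.2,             // scale step between pyramid levels
                                           2,               // neighbouring hits needed to accept
                                           CV_HAAR_DO_CANNY_PRUNING,
                                           cvSize(30, 30)); // smallest face worth reporting
        if (faces != NULL && faces->total > 0) {
            return *(CvRect *)cvGetSeqElem(faces, 0);
        }
        return cvRect(0, 0, 0, 0);
    }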

Roy’s sample code was written for iOS 3, and iOS 4 provides much easier methods for accessing the video feed from the device. I updated Roy’s code to use the new AVCaptureVideoDataOutput class that provides direct access to uncompressed video frames from any iPhone camera. This bit took a little while longer than it should have done, as the video feed is provided as landscape (you’re recording a video, right?) whereas its preview feed is oriented the same way as the camera. This was not obvious, and made worse by the fact that face detection algorithms do not work when the image is rotated by 90°… There was a point in the early hours of Saturday morning when I thought there would be no face detection at all!
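
For reference, the capture setup looks roughly like this (a sketch rather than the code in my repo; the class name, preset and queue wiring are assumptions):

    // Sketch of an iOS 4 AVFoundation capture pipeline feeding uncompressed
    // frames to a delegate callback.
    #import <AVFoundation/AVFoundation.h>
    #import <CoreMedia/CoreMedia.h>
    #import <CoreVideo/CoreVideo.h>

    @interface CameraGrabber : NSObject <AVCaptureVideoDataOutputSampleBufferDelegate>
    - (void)startCapture;
    @end

    @implementation CameraGrabber

    - (void)startCapture {
        AVCaptureSession *session = [[AVCaptureSession alloc] init];  // kept as an ivar in real code
        session.sessionPreset = AVCaptureSessionPreset640x480;

        // Pick a camera (camera selection is sketched further down).
        AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        NSError *error = nil;
        [session addInput:[AVCaptureDeviceInput deviceInputWithDevice:camera error:&error]];

        // Ask for uncompressed BGRA frames, delivered to the delegate on a serial queue.
        AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
        NSDictionary *settings = [NSDictionary dictionaryWithObject:
                                      [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA]
                                                             forKey:(id)kCVPixelBufferPixelFormatTypeKey];
        output.videoSettings = settings;
        [output setSampleBufferDelegate:self queue:dispatch_queue_create("video.frames", NULL)];
        [session addOutput:output];

        [session startRunning];
    }

    // Called once per frame; note that the pixel buffer arrives in landscape orientation.
    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection {
        CVImageBufferRef pixels = CMSampleBufferGetImageBuffer(sampleBuffer);
        CVPixelBufferLockBaseAddress(pixels, 0);
        // ...wrap the base address, width, height and bytes-per-row in an IplImage
        //    and hand it to the OpenCV face detection code...
        CVPixelBufferUnlockBaseAddress(pixels, 0);
    }

    @end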

Anyway, following Roy’s recommendations, I scaled down the input image and adjusted the parameters of the OpenCV face detection call. At the moment I transpose the image before sending it to the detection algorithm, but I suspect it would be faster to use a rotated Haar feature set (the bits that the algorithm picks out in each image to match faces). I also didn’t use Roy’s changes to use integer arithmetic rather than floating point — it turns out that the iPhone 4 has enough grunt to cope with the standard OpenCV code.
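
Roughly speaking, the preprocessing before each detection call looks like this (the greyscale conversion and the exact scale factor are illustrative; the transpose-and-flip is the 90° fix mentioned above):

    // Shrink and rotate a BGRA camera frame before handing it to the detector.
    IplImage *prepareFrame(IplImage *bgraFrame) {
        // The Haar detector works on a single-channel image, so convert to greyscale first.
        IplImage *grey = cvCreateImage(cvGetSize(bgraFrame), IPL_DEPTH_8U, 1);
        cvCvtColor(bgraFrame, grey, CV_BGRA2GRAY);

        // Scale down (a quarter of the size here) to keep detection near real-time.
        IplImage *shrunk = cvCreateImage(cvSize(bgraFrame->width / 4, bgraFrame->height / 4),
                                         IPL_DEPTH_8U, 1);
        cvResize(grey, shrunk, CV_INTER_LINEAR);

        // The camera delivers landscape frames; transpose then flip so that
        // upright faces are upright when the detector sees them.
        IplImage *portrait = cvCreateImage(cvSize(shrunk->height, shrunk->width), IPL_DEPTH_8U, 1);
        cvTranspose(shrunk, portrait);
        cvFlip(portrait, NULL, 1);  // mirror to undo the handedness change from the transpose

        cvReleaseImage(&grey);
        cvReleaseImage(&shrunk);
        return portrait;  // caller releases when done
    }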

You can get the iPhone code from my github repository and try it for yourself. Note that it’s hardwired to use the front-facing camera at the moment. If you don’t have an iPhone 4, just change the AVCaptureDevice to point to the ID of the other camera and the rest of the code should still work (though possibly a little more slowly…).
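
If you want to switch cameras programmatically rather than hardwiring it, something along these lines works with the iOS 4 API (a hypothetical helper, not part of the repo):

    #import <AVFoundation/AVFoundation.h>

    // Find a camera at the given position (front or back), falling back to the default device.
    AVCaptureDevice *cameraAtPosition(AVCaptureDevicePosition position) {
        for (AVCaptureDevice *device in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
            if (device.position == position) {
                return device;
            }
        }
        return [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    }

    // e.g. cameraAtPosition(AVCaptureDevicePositionFront) on an iPhone 4,
    //      cameraAtPosition(AVCaptureDevicePositionBack) on an older device.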

Light and dark — the LEGO program

So now I had a robot base and an iPhone app that could see faces. The next step was to connect the two together using the light sensor.

LEGO provide a drag-and-drop programming interface for Mindstorms that lets you build up programs using blocks such as “move motor” and “wait for sensor input”, plus control logic of loops and if/else switches. It’s quite capable and makes simple programs relatively easy, but using variables and arithmetic can be a little cumbersome.

The main issue in getting the robot to drive was calibrating the light sensor, especially when each demo was under different lighting. After a fair amount of tweaking (some just minutes before presenting at Mobile Monday London’s demo night), the best results turned out to be when I crammed a small piece of cardboard into the hinge that held the light sensor onto the iPhone…

You can download the “.rbtx” file from my github repository, but for those who don’t have the LEGO software, the algorithm is essentially:

  1. Calibrate the sensor when pointing at the black and white squares to either side of the control square on the iPhone screen
    • The robot prompts for each reading on its display and waits for you to press the button between readings
    • The program reads the raw values from the sensor and calculates its own scaling values, as the built-in calibration routines turn on the light on the sensor — this works for reading black lines on a white sheet of paper but isn’t so good at reading the backlit screen of the iPhone…
  2. Wait for another button press to start the robot moving — so you can step back and make sure your face is in frame
  3. Read the raw value of the light sensor, convert it into a value between -90 and +90, and then steer by that amount, then repeat (there’s a rough C sketch of this step after the list)
    • The program checks that the light sensor value is within a reasonable range before steering, otherwise the robot tends to go round in tight circles and you have to run round it like a lunatic trying to get your face in the camera frame!
  4. When the distance sensor picks up something closer than 6 inches, stop, play a sound and show a beating heart on the display (“I’ve found you!”)
  5. Start moving again when the button is pressed
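
The program itself is wired up in NXT-G blocks, but in rough C terms the scaling in step 3 boils down to something like this (the variable names and the exact range check are illustrative):

    // Map a raw light-sensor reading onto a steering value between -90 and +90.
    // blackLevel and whiteLevel are the raw readings captured during calibration.
    int steeringFromLightReading(int raw, int blackLevel, int whiteLevel) {
        int steering = ((raw - blackLevel) * 180) / (whiteLevel - blackLevel) - 90;

        // Ignore readings outside the calibrated range (the sensor has probably
        // slipped off the control square), otherwise the robot spins in tight circles.
        if (steering < -90 || steering > 90) {
            return 0;  // drive straight on until a sensible reading comes back
        }
        return steering;
    }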

To give you an idea of what this looks like in the Mindstorms NXT software, here’s a picture of the program!

Prizes

Thanks to Monotype for the beautiful poster of Gill Sans Bold Extra Condensed. They were going to give me two, but were happy to swap one for a copy of FontExplorer Pro instead, so I can see my digital fonts presented almost as prettily. Apparently, it’s now available for Windows as well as Mac OS X.