We are a team of three students working on Open Video Chat, a video conferencing activity for the XO Laptop. Our goal is an activity that runs fast enough to carry a conversation in sign language, which requires a minimum frame rate of 15 FPS. In the spirit of open source, we are also trying to reach that rate with a free codec.
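For context on why the codec matters so much, a back-of-the-envelope calculation shows what uncompressed video at 15 FPS would cost on the wire. The 320x240 capture size and I420 pixel format below are illustrative assumptions, not measurements from the XO:

```python
# Rough bandwidth estimate for uncompressed video at our target
# frame rate. 320x240 and 12 bits/pixel (I420/YUV 4:2:0) are
# assumed values for illustration only.

WIDTH, HEIGHT = 320, 240
BITS_PER_PIXEL = 12      # I420: 8 bits luma + chroma subsampled 4:2:0
FPS = 15                 # minimum rate for intelligible sign language

bits_per_frame = WIDTH * HEIGHT * BITS_PER_PIXEL
mbps = bits_per_frame * FPS / 1_000_000

print(f"{mbps:.1f} Mbit/s uncompressed")
```

At roughly 13.8 Mbit/s, raw video is far beyond what the XO's ad hoc wireless link can sustain, which is why compression is unavoidable even at modest resolutions.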
Right now we have a hack that sends video between two XOs over an ad hoc wireless network by simply flooding the other XO with UDP packets. The code is in our git repo (git://git.fedorahosted.org/OpenVideoChat.git) under the Openvideochat.activity directory. This approach has not produced acceptable frame rates for our goal.
We'd like to get the activity running properly using the "correct" methods and pipelines. We believe the right approach is to use Farsight and GStreamer to manage the streams over Telepathy, but so far we're running into walls. The XOs only support the JPEG/smoke, Schrödinger, and Theora codecs, and none of these seems ideal for this sort of application.
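As a concrete starting point, the kind of pipeline we have in mind can be prototyped outside Telepathy with gst-launch. This is a sketch, not a validated configuration: the element names are from GStreamer 0.10 as shipped on the XO, and the host address, port, bitrate, and capture size are placeholder values we chose for illustration.

```
# Sender: capture, constrain the rate and size, encode with Theora,
# packetize as RTP, and push over UDP (10.0.0.2 and 5004 are placeholders).
gst-launch-0.10 v4l2src \
    ! videorate ! video/x-raw-yuv,width=320,height=240,framerate=15/1 \
    ! ffmpegcolorspace ! theoraenc bitrate=256 \
    ! rtptheorapay ! udpsink host=10.0.0.2 port=5004

# Receiver: depayload the RTP stream, decode, and display.
gst-launch-0.10 udpsrc port=5004 \
    caps="application/x-rtp,media=video,encoding-name=THEORA" \
    ! rtptheoradepay ! theoradec ! ffmpegcolorspace ! autovideosink
```

One caveat: Theora RTP depayloading also needs the codec headers (the `configuration` caps field), so a bare caps string like the receiver's may not be sufficient on its own; delivering that setup data is exactly the kind of session negotiation Farsight and Telepathy are meant to handle.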
Using Empathy as a benchmark, we get only about one frame per second, and only after a long negotiation period. Empathy also does not allow the XOs to communicate with other machines.
We look forward to input from anyone with experience with Sugar, GStreamer, Farsight, or Telepathy who wouldn't mind lending us a hand. We will be attending the Python Hackfest in Boston this weekend, April 16th. For general questions we can be reached at ovc@…
-OVC Development Team