Wednesday, December 4, 2013

Attaching a WiFi Dongle to an AR Drone 2.0 (a.k.a. Compiling Drivers for ARM Based Devices)

Correction (4/24/2014): for your driver to work on the AR Drone 2.0, you have to compile it against the kernel version provided by Parrot, which can be found here

One of the downsides of an "out of the box" AR Drone 2.0 is that you can't control multiple drones from the same machine. Another downside is that you can't attach USB devices other than flash drives and a few other predefined devices, due to the limited nature of the busybox deployed on the drone. To overcome the first downside without manipulating the WiFi configuration of the built-in WiFi card (i.e. without messing up your new drone), you will need to overcome the second problem (i.e. install a WiFi dongle and its driver manually). In this tutorial, I walk you through the steps to do just that.

I assume that you are working on a Linux machine (seriously, we are not going to compile drivers on Windows). Ubuntu is preferable!

Because the drone has an ARM processor, we will need to compile the driver using an ARM cross-compiler against a Linux kernel compiled for ARM. The first step is to install the ARM compiler:
:~$ sudo apt-get install  gcc-arm-linux-gnueabi
Then you will need to get and compile a Linux kernel with the same version as the one on the ARM machine. You can find that information using the uname command (in the case of the AR Drone 2.0, that's 2.6.32):
:# uname -a
Linux uclibc 2.6.32.9-g4190abb #1 PREEMPT Fri Jul 20 14:10:11 CEST 2012 armv7l GNU/Linux
Notice the extraversion .9-g4190abb; we will need to edit the kernel's makefile to match it. To actually get the kernel source code:
:~$ wget
:~$ tar xjvf linux-
Open the makefile and set "EXTRAVERSION" to whatever extraversion your ARM machine has. The next step is to set up the cross-compilation environment:
:~$ export ARCH=arm
:~$ export CROSS_COMPILE=arm-linux-gnueabi-
Then, for the AR Drone 2.0, get the kernel.config file from here. Put the configuration file in the kernel's arch/arm/configs directory, rename it to "ardrone_defconfig", and compile the kernel:
:~$ make ardrone_defconfig
:~$ make
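If you prefer to script the EXTRAVERSION edit instead of doing it by hand, a small sketch could look like this (the helper name is mine, purely illustrative):

```python
import re

def set_extraversion(makefile_text, extraversion):
    """Rewrite the EXTRAVERSION line of a kernel Makefile.

    Illustrative helper: takes the Makefile contents as a string and
    returns the edited string.
    """
    return re.sub(r'(?m)^EXTRAVERSION\s*=.*$',
                  'EXTRAVERSION = %s' % extraversion,
                  makefile_text)
```

For example, `set_extraversion(open('Makefile').read(), '.9-g4190abb')` returns the Makefile text with the extraversion the drone reports.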

Now it's time to compile the driver. First, you will need to know which driver to get for the specific WiFi dongle you are using. To do that, connect the WiFi dongle to the AR Drone and telnet into the drone:

:~$ telnet 

:# lsusb
Bus 001 Device 002: ID 050d:1102 Belkin Components 
Bus 001 Device 001: ID 1d6b:0002 
Search online for a driver that matches the vendor:product ID pair (050d:1102 in my case); my dongle required the RTL8188CUS driver. After you download the driver's source code, edit its makefile to use

CROSS_COMPILE := arm-linux-gnueabi-
LINUX_SRC = location of kernel
LINUX_SRC_MODULE = (location of kernel)/drivers/net/wireless/

Then compile the driver, which will produce a .ko file (e.g. 8192cu.ko). Copy that file to the ARM machine (in the drone's case, use FTP to copy the file to the drone), then navigate through the terminal to the file's location and load the module:
:~$ telnet
:# cd /data/video
:# insmod 8192cu.ko
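The FTP copy mentioned above can also be scripted with Python's ftplib. The IP address, port, and anonymous login below are assumptions about the drone's stock FTP setup, so adjust them to your configuration:

```python
from ftplib import FTP

def stor_command(filename):
    # ftplib's storbinary expects the raw FTP command string
    return 'STOR ' + filename

def upload_driver(ip='192.168.1.1', port=5551, filename='8192cu.ko'):
    """Upload the compiled module to the drone over FTP.

    The IP, port, and anonymous login are assumptions about the drone's
    FTP server; change them to match your setup.
    """
    ftp = FTP()
    ftp.connect(ip, port)
    ftp.login()  # anonymous login
    f = open(filename, 'rb')
    try:
        ftp.storbinary(stor_command(filename), f)
    finally:
        f.close()
    ftp.quit()
```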
If the dongle is plugged in, unplug it, then replug it and test whether it's working or not:
:~$ ifconfig -a
If a new interface is there, then congrats!!
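As an aside, if you ever script this setup for several dongles, the vendor:product pairs can be pulled out of the lsusb output with a small sketch like this (purely illustrative):

```python
import re

def parse_lsusb(output):
    """Extract (vendor, product) ID pairs from lsusb output text."""
    return re.findall(r'ID ([0-9a-f]{4}):([0-9a-f]{4})', output)
```

Feeding it the lsusb output shown earlier yields pairs like `('050d', '1102')`, which you can then look up online.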

Monday, July 15, 2013

Next Generation TV Ratings Systems

TV ratings conventionally depend on focus groups and/or set-top devices that keep track of the channels and shows viewed. This approach has been the main source of information for rating TV shows, sports, news, etc., and it has been an integral part of the economics of the TV industry. But with the growing number of ways to view TV programmes either live (using TV sets or online streaming) or later (using video on demand, either through set-top devices or online portals), and with the growing number of TV viewers posting their opinions about what they are watching, rating TV programmes is becoming a more challenging task that requires innovative solutions.

On the other hand, recently booming terms like "Connected Viewers" and "Two-Screen Viewing" describe TV viewers who use their smartphones or tablets while watching TV. These smart devices form a rich source of information on the TV viewing habits of their owners. This information can help both the TV industry and TV viewers. On one side, the TV industry can definitely use information regarding TV viewing habits streaming in live from millions of mobile devices. In addition to normal ratings information, this new source comes with a detailed preference list and a characterization of each programme's viewers. This new information can help make better ads and more informed airtime assignments. Moreover, web-based and mobile-based advertisers (e.g. Google) can gather more information about a mobile device's owner, which means more information for their ad engines. On the other hand, the devices' owners can make use of social applications and recommendations that are based on their TV viewing habits.

At the Wireless Research Center @ E-JUST, we realized the potential of building such an application. An application that tracks a mobile device owner's TV habits needs to work passively, collecting information about each programme and any online activity made while watching that programme (similar to tracking web-browsing history). While detecting the programme playing on a TV is a well-addressed problem, solved for free by applications like IntoNow, the problem at hand is more complex because we cannot assume that the user will activate the application every time he/she watches TV.

In our work (accepted at Ubicomm'13), we aim at analyzing the acoustic and visual fingerprints of a TV set in order to determine whether a device's owner is watching TV or not. Our preliminary results showed a huge potential in using these two sensors (i.e. the microphone and the camera) to perform the passive detection, allowing applications like IntoNow to identify the programme playing afterwards.
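To give a flavor of the acoustic side, here is a toy illustration (not the detector from the paper) that computes a coarse spectral-energy fingerprint of an audio snippet:

```python
import numpy as np

def spectral_fingerprint(samples, bands=8):
    """Toy acoustic fingerprint: relative energy per frequency band.

    Purely illustrative of the idea; a real detector would compare such
    fingerprints against profiles of TV audio versus ambient sound.
    """
    spectrum = np.abs(np.fft.rfft(samples)) ** 2   # power spectrum
    chunks = np.array_split(spectrum, bands)       # coarse frequency bands
    energies = np.array([c.sum() for c in chunks])
    return energies / energies.sum()               # normalized distribution
```

A classifier could then threshold the distance between a live fingerprint and known TV-audio profiles to decide "TV on" versus "TV off".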

Our novel stack of applications presents a new generation of audience measurement systems that can provide larger sets of more accurate and higher-dimensional data about TV viewing. Also, linking a user's TV viewing habits to her viewing habits on online portals like YouTube will give a clearer image of the popularity of TV programmes.

For more details check our technical report on arXiv: Mohamed Ibrahim, Ahmed Saeed, Moustafa Youssef, and Khaled A. Harras, "Unconventional TV Detection using Mobile Devices", arXiv:1306.0478.

Tuesday, March 26, 2013

Integrating Click Router and GNURadio

Recently, I have been working on integrating the Click Modular Router with both WiFi cards and USRPs to facilitate the development of cognitive radio testbeds. I am using GNURadio to control N210 USRPs. I found developing routing hops interesting and worth sharing, as it requires the USRP to simultaneously send and receive. This requires the transmission and reception to be on separate frequencies.

For transmission, first a client socket must be created in the click configuration code.

Socket(TCP, localhost, 4002, Client true)

Then, I used a thread that constantly listens on that socket and transmits the data once a new packet is received. This allows for separation of tasks and good visibility of the transmission code. I noticed that there is a trend to include the transmission code within the main function, which degrades readability. The thread takes as a parameter the top_block describing the data path on the USRP.

import socket
import struct
import threading
import traceback

class tx_th(threading.Thread):
    def __init__(self, tb):
        threading.Thread.__init__(self)
        self.tb = tb
        # Click opens this socket as a client, so this side acts as the server
        self.tx_soc = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        self.tx_soc.bind(('localhost', 4002))
        self.tx_soc.listen(1)
        self.tx_soc_client, addr = self.tx_soc.accept()
        self.pktno = 0
        self.pkt_size = 1500

    def run(self):
        print "Thread Started ..."
        while True:
            try:
                data = self.tx_soc_client.recv(1024)
                if data:
                    (control_flag,) = struct.unpack('i', data[0:4])
                    if control_flag == 0:
                        (target_tx_freq,) = struct.unpack('i', data[4:8])
                        self.tb.source.u.set_center_freq(target_tx_freq, 0)
                        data_to_be_sent = data[8:]  # skip the two 4-byte control fields
                        payload = struct.pack('!H', self.pktno & 0xffff) + data_to_be_sent
                        self.tb.txpath.send_pkt(payload, False)
                        self.pktno += 1
            except Exception:
                print traceback.format_exc()
As for the reception, a server socket is created in the Click configuration file.
Socket(TCP, localhost, 4001, Client false)
And in the USRP control code, the rx_callback method is modified to be:
    def rx_callback(ok, payload):
        global n_rcvd, n_right, s

        (pktno,) = struct.unpack('!H', payload[0:2])
        data_recv = payload[2:]

        if s is None:
            print "Data Socket Recv None"
            # Click listens as the server on port 4001, so connect to it
            s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
            s.connect(('localhost', 4001))
        s.send(data_recv)
The assumption here is that each packet is annotated with its packet number (for debugging purposes).
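Under that assumption, the framing used on both sides can be sketched as a pair of helpers (the names are mine):

```python
import struct

def frame(pktno, data):
    """Prepend a 16-bit big-endian packet number to the data,
    matching the payload layout handed to send_pkt."""
    return struct.pack('!H', pktno & 0xffff) + data

def unframe(payload):
    """Split a received payload back into (pktno, data),
    mirroring what rx_callback does."""
    (pktno,) = struct.unpack('!H', payload[0:2])
    return pktno, payload[2:]
```

Note that the packet number wraps at 0xffff, so it is only reliable for debugging and short-term loss accounting, not as a global sequence number.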