CrowdCast

Mobile Web Research

Objective

To develop a mobile application for citizen journalism.

Motivation

Writing on the future of mobile, Google's Engineering Director Andy Rubin predicts:

Crowd sourcing goes mainstream: Your phone is your omnipresent microphone to the world, a way to publish pictures, emails, texts, Twitters, and blog entries.

We asked, "Why not videos?"

Problem Space and Project Overview

Most of the time, citizens are the first witnesses to an event; they may hold its most precise details and hence the key to truth and accuracy. For example, in a dispute, the accounts of people present at the scene would be more accurate than a reporter's interpretation of their version, which could be biased.

Our aim in this project is not to judge whether a story is biased; we proceed with the more modest aim of giving people a ubiquitous tool for reporting and broadcasting the events around them.

Mobile technology is developing rapidly, and recent advances on this frontier have made it possible to leverage the multimedia capabilities of these devices beyond traditional purposes. Mobile phones are ubiquitous among the masses and can be used to capture an event. We intend the crowd not just to capture an event but to transmit the feed live to a web server.

Further, we don't want to limit this to a single-user view; we want to extend it to an n-user model in which each user can send a feed, providing multiple views and a diverse perspective of an event.

State of the Art System

Feeds: Videos sent by the 'crowd' from mobile phones (live or recorded), with location information (GPS coordinates with street-level accuracy) and timestamps.

Events: An event is a collection of video feeds captured and transmitted (streamed) from several mobile phones. Operational definition: a collection of video feeds with the same location and concurrent or overlapping start times defines an event, based on the assumption that any event being streamed will share the same location and overlapping time frames. Later, tagging and categorization could provide another, crowd-generated way of classifying events.
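The operational definition above can be sketched in code. The following is a minimal illustration, not the deployed implementation: the `Feed` class, the ~100 m "same location" threshold, and the greedy grouping strategy are all our own assumptions for the sketch.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the operational event definition: feeds with nearby GPS
// coordinates and overlapping time windows are grouped into one event.
// Class names and the distance threshold are illustrative assumptions.
public class EventGrouper {

    // One video feed: GPS position plus start/end timestamps (millis).
    public static final class Feed {
        final double lat, lon;
        final long startMillis, endMillis;
        public Feed(double lat, double lon, long startMillis, long endMillis) {
            this.lat = lat; this.lon = lon;
            this.startMillis = startMillis; this.endMillis = endMillis;
        }
    }

    // "Same location" approximated as within roughly 0.001 degrees (~100 m).
    static boolean sameLocation(Feed a, Feed b) {
        return Math.abs(a.lat - b.lat) < 0.001 && Math.abs(a.lon - b.lon) < 0.001;
    }

    // Time windows overlap if neither feed ends before the other starts.
    static boolean overlaps(Feed a, Feed b) {
        return a.startMillis <= b.endMillis && b.startMillis <= a.endMillis;
    }

    // Greedy grouping: add each feed to the first event containing a
    // matching member, otherwise start a new event.
    public static List<List<Feed>> group(List<Feed> feeds) {
        List<List<Feed>> events = new ArrayList<>();
        for (Feed f : feeds) {
            List<Feed> home = null;
            for (List<Feed> event : events) {
                for (Feed member : event) {
                    if (sameLocation(f, member) && overlaps(f, member)) {
                        home = event;
                        break;
                    }
                }
                if (home != null) break;
            }
            if (home != null) {
                home.add(f);
            } else {
                List<Feed> fresh = new ArrayList<>();
                fresh.add(f);
                events.add(fresh);
            }
        }
        return events;
    }
}
```

Under this sketch, two feeds from the same street corner with overlapping recording windows fall into one event, while a distant feed starts a new one.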

Video Server: Receives video feeds from all locations and classifies them as live or archived, generating an XML feed for each of these two classifications. Salient features include encoding uncompressed video feeds into the open-source Ogg Theora format. It also provides an option to encode videos into the .flv format, supporting streaming over a range of Flash players, and generates a JPEG thumbnail for each video. The XML feed also contains the location and time information of all the videos.
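A feed of this shape might look like the following. This is a hypothetical sketch only; the element and attribute names are our assumptions, not the server's actual schema.

```xml
<!-- Hypothetical feed layout; element and attribute names are illustrative. -->
<videos>
  <video id="v1" status="live">
    <location lat="40.7128" lon="-74.0060"/>
    <time start="2009-03-01T14:03:00Z" end="2009-03-01T14:05:30Z"/>
    <formats theora="v1.ogv" flash="v1.flv"/>
    <thumbnail>v1.jpg</thumbnail>
  </video>
  <video id="v2" status="archive">
    <location lat="40.7129" lon="-74.0061"/>
    <time start="2009-03-01T13:40:00Z" end="2009-03-01T13:42:10Z"/>
    <formats theora="v2.ogv" flash="v2.flv"/>
    <thumbnail>v2.jpg</thumbnail>
  </video>
</videos>
```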

Prototype Development

Web Application

Built on the Drupal content management system. Drupal receives XML feeds from the video server, uses the location and time information to distinguish between events, and then displays the events (live and archived) on the web front end. We are adding features such as tagging and categorization for users of the web front end, moving the system toward a more collaborative way of generating as well as organizing content.
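The first step on the web side is extracting each video's location from the server's XML feed. The following is a language-agnostic sketch of that step (shown here in Java with the standard DOM parser, though the actual implementation lives in Drupal); the element and attribute names are assumptions, not the server's real schema.

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

// Sketch of the web application's first step: pull per-video location
// information out of the video server's XML feed. Element and attribute
// names are illustrative assumptions.
public class FeedParser {

    public static final class FeedInfo {
        public final String id;
        public final double lat, lon;
        public FeedInfo(String id, double lat, double lon) {
            this.id = id; this.lat = lat; this.lon = lon;
        }
    }

    public static List<FeedInfo> parse(String xml) {
        try {
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder()
                    .parse(new ByteArrayInputStream(
                            xml.getBytes(StandardCharsets.UTF_8)));
            NodeList videos = doc.getElementsByTagName("video");
            List<FeedInfo> out = new ArrayList<>();
            for (int i = 0; i < videos.getLength(); i++) {
                Element v = (Element) videos.item(i);
                Element loc = (Element) v.getElementsByTagName("location").item(0);
                out.add(new FeedInfo(
                        v.getAttribute("id"),
                        Double.parseDouble(loc.getAttribute("lat")),
                        Double.parseDouble(loc.getAttribute("lon"))));
            }
            return out;
        } catch (Exception e) {
            throw new RuntimeException("bad feed XML", e);
        }
    }
}
```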

Mobile Application

J2ME-based Mobile App - This component streams video from the mobile phone to the web, live. The MMAPI in the J2ME platform we are using does not support live streaming: a video cannot be passed to the next step (uploading to the server) until it has been fully recorded. We are therefore manually breaking the stream into smaller chunks and writing server-side code that can handle video arriving in chunks rather than as a continuous stream; this, however, introduces a time lag into the stream.

In addition, all S60 applications require Symbian certification, so every time the application accesses the hardware to record video, the user is prompted to confirm for security. Since we divide the stream into recorded segments sent to the server at fixed intervals (say every t seconds), the user is prompted every t seconds to confirm camera access. Hence our low-fidelity prototype suffers a time lag due to this J2ME limitation.
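The chunked-upload workaround can be sketched as follows. This is a simplified illustration that assumes the recorded bytes for one interval are already in memory; the chunk size, class name, and method name are our own, not MMAPI calls, and the actual upload step is omitted.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Sketch of the workaround for MMAPI's lack of live streaming: record for a
// fixed interval, then split the recorded clip into chunks that are uploaded
// one at a time. The chunk size and class name are illustrative assumptions.
public class ChunkedUploader {

    // Split a recorded clip into fixed-size chunks; the last chunk may be
    // shorter. Each chunk would then be POSTed to the server in turn.
    public static List<byte[]> split(byte[] recording, int chunkSize) {
        List<byte[]> chunks = new ArrayList<>();
        for (int off = 0; off < recording.length; off += chunkSize) {
            int end = Math.min(off + chunkSize, recording.length);
            chunks.add(Arrays.copyOfRange(recording, off, end));
        }
        return chunks;
    }
}
```

The time lag described above follows directly from this design: a chunk is only available for upload after its full recording interval has elapsed, so the feed always trails live action by at least that interval.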