NOTES ON #PU27

#PU27 was one of the first “television you can talk to” experiments, a PBS tie-in with the Nova series Making Stuff. Princeton University organized a technology exposition for high school students, with exhibitors promoting the idea of studying engineering. We did a live broadcast on the internet using Ustream and had viewers direct the program. The Twitter hashtag PU27 (#PU27) was used to communicate direction to the production crew and for viewers to discuss the exhibits.

Production:

Video Shooting: The iPhone/iPod touch is very difficult to mount and carry while shooting. It is less awkward in flipped-camera (self-shooting) mode, but still clumsy, especially for left-handed operators. The units were gaffer-taped to a glass window for some shots. For oblique angles, the unit was rested on a cup or can angled against the corner ledge of the window and then taped from the window frame to the device. The viewfinder/display screen was used to ballpark the shot. A laptop pointed at the iPod touch's stream was only partially useful as a viewfinder because of the excessive lag between the camera and the display.

The lens is acceptably wide and focus remains constant, though, as you would expect, depth of field is sacrificed a bit.

Audio: Sound recording is good to excellent when the operator is speaking or the camera is close to and in front of the subject. Once the sound originates farther away or is a mix of more than one source, external microphones are needed. Rich has done some research in this area. Once the four-conductor 3.5 mm plug issue is resolved for input and output on the iPhone/iPod touch, a mini Shure mixer with conventional mics will be explored. The four-ring assignments on Android phones are thought to differ from the iPhone configuration; this should be confirmed. Less expensive, wireless, high-quality audio also needs more research. Bluetooth-enabled units may be worth exploring.
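A hedged reference for the plug issue: the two common four-pole (TRRS) wiring schemes are believed to be assigned as follows, but this is worth confirming with a meter before wiring any adapter.

```
Contact   CTIA (iPhone, most newer phones)   OMTP (some older Android/Nokia)
Tip       Left audio                         Left audio
Ring 1    Right audio                        Right audio
Ring 2    Ground                             Microphone
Sleeve    Microphone                         Ground
```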

Transmission

The IP application iWebcam was not purchased for this experiment. It, or a similar tool, is required to make the most of limited-bandwidth situations. 3.1.11: Got it – not impressed – HUGE lag P-P. Will retry later; perhaps a VLC solution makes more sense here. Interesting side note: two or more iPods (and, I assume, iPhones) can be cloned along with their installed apps. This may not be very practical for a personal communication device but is ideal for production. That is, using two cloned devices, when device one draws down its battery, device two can be turned on and pick right up with all the apps, accounts and configurations of the drained device 🙂

IP connectivity to a laptop would allow more responsive image updates, so the laptop could be used as a remote viewfinder. Chicken of the VNC (a VNC client for the Mac) is one way to accomplish this, but it has not been updated for the newer iOS. Even without robotic articulation (pan/tilt/zoom), remote access to the device via IP could be very useful. A laptop set beside the camera can serve two functions. Assuming an independent broadcast from the camera (which defeats the bandwidth conservation mentioned earlier), functions like initiating the broadcast, muting audio, and selecting channels are more easily done remotely without disturbing the shot composition. A producer at the site, using the laptop, can also communicate via text to give users supplementary information not seen or heard by the imaging unit.
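As a low-lag alternative to VNC for the viewfinder role, here is a minimal sketch in Python, assuming the phone app exposes its camera as an MJPEG or RTSP stream at a known local address. The URL, port and path below are hypothetical; substitute whatever the phone app actually serves.

```python
# Remote viewfinder: pull the phone's network video stream on the laptop.
import cv2

STREAM_URL = "http://192.168.1.50:8080/video"  # hypothetical MJPEG endpoint

cap = cv2.VideoCapture(STREAM_URL)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:                                 # stream dropped or unreachable
        break
    cv2.imshow("remote viewfinder", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):      # press q to close the window
        break

cap.release()
cv2.destroyAllWindows()
```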

Without the DSI/Newssharks there was no IP video. IP video is critical in limited-bandwidth situations, as only the selected image and audio sources are transmitted to the users.

Control Room

It’s still not clear whether the Ustream “Pro” producer upgrade allows switching between multiple cameras that are not hardware-connected. After Ed discovered that a single Ustream account can host multiple channels, it seems the Pro version might allow switching much like the crude router we used on the NJN website.

VidBlaster is a software-based control room that includes graphics for titling, sound, animation, playlists, screen-capture transmission and IP-based cameras. Running this application on a Toaster/TriCaster or in a cascading configuration has some interesting implications. Used in combination with Skype, ooVoo, or other conferencing applications and more traditional switching devices, the mashup makes for an ad hoc transmission facility with routing as well as switching. In other words, more than just a control room in a box – a broadcasting facility in a box. Lessons learned using this equipment will be valuable as more mobile assets are integrated into the traditional plant.

VidBlaster needs a free Adobe Flash encoder to upload to Ustream. I was unable to get it working this way, but I saw that Ustream Producer recognized it. One way or the other, this combo looks like it will work. 3.1.11: The Adobe Flash encoder is now working fine and can send directly to the NJN streaming account. It has also been used in tandem with …

Front and back channel communications.

Definition – Front-channel communication includes any public means to direct or comment on the live event. One example is a teacher asking a person at a remote site to focus on an object and make a query: “What’s that red thing on the right – can the presenter please explain what it’s used for?” Another example is a student commenting on the type of apparatus being used to shoot the image: “Hey, are you guys using an iPhone to shoot this? It looks like my setup.” Yet another example of front-channel communication: “I was at that site earlier and it’s not as cool as it looks. Anyone going for burgers after this is over?”

Back-channel communication includes private group messaging to update participants on ongoing situations. It may include camera direction, location information, and timing information. This can include pre-production information, production information, and post-production summaries.

Application – Front-channel communication was only somewhat successful. The plan was to combine tweets from Twitter and Ustream chat in a more universal chat tool, Mibbit. This stream of information would be filtered (manually or automatically), with relevant information routed to the back channel: Cam1 would get the question about the red thing in the example above, but not the comments about the burgers (see the sketch below). Funneling all public communication into a single channel may be a bit difficult for the uninitiated to follow, but in time participants should get used to it.
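A minimal sketch of that filtering step in Python. The Cam1..Cam4 tag format matches the plan above, while the back-channel names are assumptions for illustration:

```python
# Route front-channel messages tagged Cam1..Cam4 to that unit's back
# channel; untagged chatter (burgers, etc.) stays on the front channel.
import re
from typing import Optional

CAM_TAG = re.compile(r"\bcam\s*([1-4])\b", re.IGNORECASE)

def route(message: str) -> Optional[str]:
    """Return the back-channel name for a tagged message, or None."""
    match = CAM_TAG.search(message)
    return f"backchannel-cam{match.group(1)}" if match else None

# Quick check against the examples above:
assert route("Cam1 what's that red thing on the right?") == "backchannel-cam1"
assert route("anyone going for burgers after this is over?") is None
```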

Back-channel communication was not handled well at all. A mishmash of telephone (private lines and hotlines), email, moot.ly, mailing lists and Doodle was attempted, with poor results. A common, secure and consistent way to keep everyone involved informed is crucial; as we learned, this is particularly important when weather is a factor. A scheduling function is a useful feature, along with mobile links. Distributing the information to Outlook or other calendars, with opt-in email notifications, would help. The ability to paste directions, pictures, sounds, even video, along with chat, is something we’ll want to have available. Once the right tool is found, we must be sure that everyone involved knows how to get to it and can use it. It will also serve as a convenient way to document our issues and solutions. The ability to look to one definitive source for information is even more critical when we have participants in different groups.

During the event itself we need to be able to communicate with each other easily and consistently. This has to be mandated by the producing group. We should strive to use a tool that transcends individuals’ tool sets: not everyone uses Outlook or an iPhone, but a message about a change in plans has to be universal.

The original plan called for one or more individuals to parse the direction and questions tagged with Cam1 (2, 3, 4) from the pool of information and resend them to the back channel for the unit’s producer to act upon. A well-versed camera operator can take direction while maintaining composition and focus, and possibly monitoring audio levels; to have them also ask questions and follow the action might be too much to ask. Therefore a unit producer will need to carry a device for communication. For smaller or other events, a single camera-operator/producer may be able to handle everything, but a system of acceptable-use guidelines will need to be prepared.

Work has begun on building a chat bridge that filters messages and supports Twitter and other texting platforms along with Mibbit. IRC (Internet Relay Chat) is a mature, stable and feature-rich environment in which to develop our application.
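For illustration, a bare-bones sketch of the IRC side of such a bridge using only Python’s standard library. The server, nick and channel are placeholders, and a real bridge would pass each message through a filter like the route() sketch above rather than printing it:

```python
# Bare-bones IRC client: connect, join a channel, answer PINGs, and hand
# each channel message to the filter. Server/nick/channel are placeholders.
import socket

SERVER, PORT = "irc.libera.chat", 6667   # placeholder network
NICK, CHANNEL = "pu27bridge", "#pu27"    # placeholder identity/channel

sock = socket.create_connection((SERVER, PORT))
sock.sendall(f"NICK {NICK}\r\nUSER {NICK} 0 * :{NICK}\r\n".encode())
sock.sendall(f"JOIN {CHANNEL}\r\n".encode())

buf = b""
while True:
    data = sock.recv(4096)
    if not data:                                # server closed the connection
        break
    buf += data
    while b"\r\n" in buf:
        line, buf = buf.split(b"\r\n", 1)
        text = line.decode(errors="replace")
        if text.startswith("PING"):             # keep-alive handshake
            sock.sendall(text.replace("PING", "PONG", 1).encode() + b"\r\n")
        elif " PRIVMSG " in text:               # a channel message
            msg = text.split(" :", 1)[-1]
            print(msg)                          # a real bridge runs route(msg)
```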

Presentation:

The original plan called for remote units to be directed by a centralized control room. The control room would convey direction from viewers watching a single program. The intention of this exercise was not for the audience to direct a multiple-camera shoot of one event, but to have audience-guided coverage from several single-camera crews covering different events at a single venue.

While the main “show” was broadcast, the other units would stream their audio and video feeds without user direction. This approach was chosen for several reasons. First, the amount of manpower required to parse direction from general chatter and then relay it to each remote unit was too great: we believed an on-camera producer was needed at each site, plus a unit producer to moderate and parse direction from the crowd. These problems could be overcome, but they raise some questions. Some users may want to be in complete control of what’s going on; for them, a single event site that allows individual selection of different views of the events would be attractive. Users preferring more passive consumption might want a more traditional treatment of the subject with occasional input to the crew. Still others will just observe, with no desire to interact. One other component not included in this treatment is the participants at the event itself. Streaming our feed to the attendees’ smartphones and including their participation would add another dimension that may engage both actual and virtual participants.

Ed suggested another experiment: multiple cameras documenting a single event from different perspectives, a sporting event for example. Individuals equipped with their own camera phones would shoot the action from their seats. The resulting images would be aggregated on a single website with all the cameras visible, and the viewer would then be able to select a single angle.

This leads us to the site(s) itself.

The original concept was to have one main landing page serving as an interactive viewing platform. The user sees video of the selected remote crew in one window that can be popped out and repositioned. A second chat window, also detachable, is used for commenting and to relay direction to the producer. There were not enough people to process direction for all the crews, so each crew would “freelance” – assume they were independently live – until they were called to appear on the main screen. On a second page, video from the three crews would be displayed in smaller windows and the user would select the one they wanted; the selected crew would be enlarged and the others would fade back. A corresponding chat window would be joined to each crew for user comments and conversation, but there would be no user direction. Open questions with this approach: How do we drive the windows with real-time updates (one polling approach is sketched below)? What chat components do we need, such as the pop-out function and pre- and post-event pages? Directions are good, but intuitive design is better.
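On the real-time-update question, one hedged possibility is a small status endpoint the landing page polls to learn which crew is live. A sketch with Python’s standard library; the port, paths and JSON shape are assumptions:

```python
# Minimal status endpoint: the landing page polls /status to learn which
# crew is live and resizes its windows accordingly. Field names are
# illustrative; in production STATE would be updated by the control room.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

STATE = {"live_crew": "cam2", "crews": ["cam1", "cam2", "cam3"]}

class StatusHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/status":
            body = json.dumps(STATE).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

HTTPServer(("", 8000), StatusHandler).serve_forever()
```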
