FB Facts

DIRECT LIFT OF . . . . THIS

Thank you Gareth Case

Facebook’s growth and success are unprecedented. The 50 facts below highlight the true scale of Facebook and its users’ activity. But will we ever see a communications platform of this size again? What’s going to be the next big thing? Google+ hasn’t had quite the impact I was expecting, and if a brand like Google can’t conquer the social networking world, then what can? Social media as it is today is, in my opinion, saturated; surely only a new breed of media and communications could have an impact big enough to make a dent in the current landscape.

1. 1 in every 13 people on Earth is on Facebook
2. The 35+ demographic represents more than 30% of the entire user base
3. 71.2% of all US internet users are on Facebook
4. In 20 minutes 1,000,000 links are shared on Facebook
5. In 20 minutes 1,484,000 event invites are posted
6. In 20 minutes 1,323,000 photos are tagged
7. In 20 minutes 1,851,000 status updates are entered
8. In 20 minutes 1,972,000 friend requests are accepted
9. In 20 minutes 2,716,000 photos are uploaded
10. In 20 minutes 2,716,000 messages are sent
11. In 20 minutes 10.2 million comments are posted
12. In 20 minutes 1,587,000 wall posts are written
13. 750 million photos were uploaded to Facebook over New Year’s weekend
14. 48% of young Americans said they found out about news through Facebook
15. 48% of 18 to 34 year olds check Facebook right when they wake up
16. 50% of active users log on to Facebook on any given day
17. The average user has 130 friends
18. People spend over 700 billion minutes per month on Facebook
19. There are over 900 million objects that people interact with (pages, groups, events and community pages)
20. The average user is connected to 80 community pages, groups and events
21. The average user creates 90 pieces of content each month
22. More than 30 billion pieces of content (web links, news stories, blog posts, notes, photo albums, etc.) are shared each month
23. More than 70 translations are available on the site
24. About 70% of Facebook users are outside the United States
25. Over 300,000 users helped translate the site through the translations application
26. Entrepreneurs and developers from more than 190 countries build with Facebook Platform
27. People on Facebook install 20 million applications every day
28. Every month, more than 250 million people engage with Facebook on external websites
29. Since social plugins launched in April 2010, an average of 10,000 new websites integrate with Facebook every day
30. More than 2.5 million websites have integrated with Facebook, including over 80 of comScore’s U.S. Top 100 websites and over half of comScore’s Global Top 100 websites
31. There are more than 250 million active users currently accessing Facebook through their mobile devices
32. People who use Facebook on their mobile devices are twice as active on Facebook as non-mobile users
33. There are more than 200 mobile operators in 60 countries working to deploy and promote Facebook mobile products
34. Al Pacino’s face was on the original Facebook homepage
35. One early Facebook function was a file sharing service
36. The first “Work Networks”, as well as the original educational networks, included Apple and Microsoft
37. The meaning of the term “poke” has never been defined
38. There is an app to see what’s on the Facebook cafe menu
39. Mark Zuckerberg (CEO of Facebook) calls himself a “Harvard graduate” when in fact he didn’t graduate (apparently his reply is that “there isn’t a setting for dropout”)
40. Australians spend more time per month on Facebook than any other country, at over 7 hours on average
41. A Facebook employee hoodie sold for $4,000 on eBay
42. Facebook was initially bankrolled by Peter Thiel, the co-founder of PayPal, for $500,000
43. It is the second biggest website by traffic, behind Google (at the moment)
44. Facebook is now valued at approximately $80 billion
45. Facebook makes money through advertising and virtual products
46. Facebook was almost shut down by a lawsuit from ConnectU, who claimed that Zuckerberg stole the idea and technology for Facebook (the issue was settled out of court)
47. The USA has the largest Facebook user base with 155 million people, which represents 23.6% of Facebook’s total users
48. There are over 16,000,000 Facebook fan pages
49. Texas Hold’em Poker is the most popular Facebook page with over 41 million fans
50. There are more than 650 million active users

NOTES ON #PU27

#PU27 was one of the first “television you can talk to” experiments, a PBS tie-in with the Nova series Making Stuff. Princeton University organized a technology exposition for high school students, with exhibitors promoting the idea of studying engineering. It involved doing a live broadcast on the internet using Ustream and having viewers direct the program. The Twitter hashtag PU27 (#PU27) was used to communicate direction to the production crew and for viewers to discuss the exhibits.

Production:

Video shooting: The iPhone/iPod touch is very difficult to mount and carry while shooting. Less so in flipped-cam mode (self shooting), but still awkward, especially for left-handed operators. The units were gaffer-taped to a glass window for some shots. For oblique angles the unit was rested on a cup or can angled against the corner ledge of the window and then taped from the window frame to the device. In this way the viewfinder/display screen is used to ballpark the shot. A laptop pointed at the stream from the iTouch was only partially useful as a viewfinder because of the excessive lag between the camera and the display.

The angle of view is pleasingly wide and focus remains constant, though, as you would expect, depth of field is sacrificed a bit.

Audio: Sound recording is good to excellent when the operator is speaking or the camera is close to and in front of the subject. Once the sound originates farther away or is a mix of more than one source, external microphones are needed. Rich has done some research in this area. Once the 4-conductor 3.5 mm plug issue is resolved for input and output on the iPhone/iTouch, a mini Shure mixer with conventional mics will be explored. The four ring assignments on Android phones are thought to differ from the iPhone configuration; this should be confirmed (a reference sketch follows below). Additionally, less expensive, wireless, high-quality audio needs more research. Bluetooth-enabled units may be something worth exploring.
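
For planning purposes only, the snippet below records the commonly cited CTIA and OMTP wiring conventions for 4-conductor 3.5 mm plugs. These assignments are assumptions drawn from general documentation, not measurements from our devices, so they should be verified with a meter before any mixer cabling is built.

```python
# Assumed TRRS (4-conductor 3.5 mm) pin assignments, listed tip to sleeve.
# CTIA is commonly cited for the iPhone/iPod touch; OMTP for some older handsets.
# These are planning assumptions only -- verify against the actual devices.
TRRS_PINOUTS = {
    "CTIA (assumed for iPhone/iTouch)": ("left audio", "right audio", "ground", "mic"),
    "OMTP (reported on some older phones)": ("left audio", "right audio", "mic", "ground"),
}

for standard, (tip, ring1, ring2, sleeve) in TRRS_PINOUTS.items():
    print(f"{standard}: tip={tip}, ring1={ring1}, ring2={ring2}, sleeve={sleeve}")
```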

Transmission

The IP application iWebcam was not purchased for this experiment. It or a similar tool is required to make the most of limited-bandwidth situations. 3.1.11: Got it – not impressed – HUGE P-P lag; will retry later. Perhaps a VLC solution makes more sense here. Interesting side note – two or more iPods (and, I assume, iPhones) can be cloned along with their installed apps. This may not be very practical for a personal communication device, but it is ideal for production. That is, using two cloned devices, when device one draws down its battery, device two can be turned on and pick right up with all the apps, accounts, and configurations of the drained device 🙂

IP connectivity to a laptop would allow for more responsive updating of images so that the laptop can be used as a remote viewfinder. Chicken of the VNC (a VNC client for the Mac) is one way to accomplish this but has not been updated for the newer iOS. Even without robotic articulation (pan/zoom/tilt), remote access to the device via IP could be very useful. A laptop set beside the camera can serve two functions. Assuming an independent broadcast from the camera (which defeats the bandwidth conservation spoken of earlier), functions like initiating the broadcast, audio mute, and channel selection are more easily done remotely without disturbing the shot composition. And a producer at the site, using the laptop, can communicate via text with the control room and with users, providing supplementary information not seen or heard by the imaging unit.

Without the DSI/Newssharks there was no IP video. IP video is critical for limited bandwidth as only the selected image and audio sources are transmitted to the users.

Control Room

It’s still not clear whether the Ustream “Pro” producer upgrade allows switching between multiple cameras that are not hardware-connected. After Ed discovered the multiple channels available on a single Ustream account, it seems the Pro version might allow switching much like the crude router we used on the NJN website.

VidBlaster is a software-based control room that includes graphics for titling, sound, animation, playlists, screen-capture transmission, and IP-based cameras. Running this application on a Toaster/Tricaster or in a cascading configuration has some interesting implications. Used in combination with Skype, ooVoo, or other conferencing applications and more traditional switching devices, the mashup makes for an ad hoc transmission facility with routing as well as switching. In other words, more than just a control room in a box – a broadcasting facility in a box. Lessons learned using this equipment will be valuable as more mobile assets are integrated into the traditional plant.

VidBlaster needs a free Adobe Flash encoder to upload to Ustream. I was unable to get it working this way, but I saw that Ustream Producer recognized it. One way or the other this combo looks like it will work. 3.1.11: The Adobe Flash encoder is now working fine and can send directly to the NJN streaming account. It has also been used in tandem with

Front and back channel communications

Definition – Front channel communication includes any public means to direct or comment on the live event. One example is a teacher asking a person at a remote site to focus on an object and make a query: “What’s that red thing on the right – can the presenter please explain to us what it’s used for?” Another example is a student commenting on the type of apparatus being used to shoot the image: “Hey, are you guys using an iPhone to shoot this? It looks like my setup.” Yet another example of front channel communication: “I was at that site earlier and it’s not as cool as it looks – anyone going for burgers after this is over?”

Back channel communication includes private group messaging to update participants on ongoing situations. It may include camera direction, location information, and timing information. This can include pre-production information, production information, and post-production summaries.

Application – Front comm. was only somewhat successful. The plan was to pull tweets from Twitter along with Ustream chat into a more universal chatting tool, Mibbit. This stream of information would be filtered (manually or automatically), with relevant information being routed to the back channel. Cam1 would get the question about the red thing in the example above, but not the comments about the burgers (a rough sketch of this tag-based routing appears below). Funneling all public communication into a single channel may be a bit difficult to follow for the uninitiated, but in time participants should get used to it.
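
To make the routing idea concrete, here is a minimal sketch of the kind of tag filter described above. It assumes public messages are plain text and that direction is tagged “Cam1” through “Cam4”; the function name and channel naming are illustrative assumptions, not part of the actual tooling.

```python
import re

# Assumed tag format: "Cam1" .. "Cam4" appearing anywhere in a public message.
CAM_TAG = re.compile(r"\bcam([1-4])\b", re.IGNORECASE)

def route_message(text):
    """Return an assumed back-channel name for a tagged message, or None to leave it on the front channel."""
    match = CAM_TAG.search(text)
    if match:
        return "backchannel-cam" + match.group(1)
    return None

if __name__ == "__main__":
    # The tagged question is routed to Cam1's back channel; the burger chatter is not.
    for msg in ["Cam1 what's that red thing on the right?",
                "anyone going for burgers after this is over?"]:
        print(repr(msg), "->", route_message(msg))
```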

Backchannel communication was not handled well at all. A mishmash of telephone (private lines and hotlines), email, moot.ly, mailing lists, and Doodle was attempted, with poor results. A common, secure, and consistent way to keep everyone involved informed is crucial. As we learned, this is particularly important when weather is a factor. A scheduling function is a useful feature, along with mobile links, distribution of the information to Outlook or other calendars, and opt-in email notifications. The ability to paste directions, pictures, sounds, and even video alongside chat is something we’ll want to have available. Once the right tool is found we must be sure that everyone involved knows how to get to it and can use it. It will also serve as a convenient way to document our issues and solutions. The ability for everyone to look to one definitive source for information is even more critical when we have participants in different groups.

During the event itself we need to be able to communicate with each other easily and consistently. This has to be mandated by the producing group. We should strive to use a tool that transcends individuals’ tool-sets. Not everyone uses Outlook or an iPhone, but a message about a change in plans has to be universal.

The original plan called for one or more individuals to parse direction and questions tagged Cam1 (2, 3, 4) out of the pool of information and resend them to the back channel for the producer of the unit to act upon. A well-versed camera operator can take direction while maintaining composition and focus and possibly monitoring audio levels. To have them also ask questions and follow the action might be too much to ask. Therefore a unit producer will need to carry a device for communication. For smaller or other events a single camera-operator/producer may be able to handle everything, but a system that spells out acceptable uses will need to be prepared.

Work has begun on building a chat bridge that filters messages and supports Twitter and other texting platforms along with Mibbit. IRC (Internet Relay Chat) is a mature, stable, and feature-rich environment in which to develop our application; a minimal skeleton is sketched below.
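
As a point of departure, here is a minimal sketch of what the IRC side of such a bridge could look like, using only the Python standard library. The server, nickname, channel names, and the Cam-tag convention are placeholder assumptions for illustration – this is not the bridge actually under development.

```python
import socket

# Placeholder settings for illustration only.
SERVER = "irc.example.org"       # assumed IRC server
PORT = 6667
NICK = "pu27bridge"
FRONT_CHANNEL = "#pu27"          # assumed public front channel
BACK_CHANNEL = "#pu27-crew"      # assumed private back channel for direction

def send(sock, line):
    """Send one raw IRC line, terminated per the protocol."""
    sock.sendall((line + "\r\n").encode("utf-8"))

def main():
    sock = socket.create_connection((SERVER, PORT))
    send(sock, f"NICK {NICK}")
    send(sock, f"USER {NICK} 0 * :PU27 chat bridge")

    buffer = ""
    joined = False
    while True:
        buffer += sock.recv(4096).decode("utf-8", errors="replace")
        while "\r\n" in buffer:
            line, buffer = buffer.split("\r\n", 1)
            if line.startswith("PING"):
                send(sock, "PONG" + line[4:])   # keep the connection alive
            elif " 001 " in line and not joined:
                # Registered with the server; join the front and back channels.
                send(sock, f"JOIN {FRONT_CHANNEL}")
                send(sock, f"JOIN {BACK_CHANNEL}")
                joined = True
            elif f"PRIVMSG {FRONT_CHANNEL} " in line:
                text = line.split(" :", 1)[-1]
                # Forward camera-tagged direction to the back channel; ignore general chatter.
                if text.lower().startswith(("cam1", "cam2", "cam3", "cam4")):
                    send(sock, f"PRIVMSG {BACK_CHANNEL} :{text}")

if __name__ == "__main__":
    main()
```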

Presentation:

The original plan called for remote units to be directed by a centralized control room. The control room would convey direction from viewers watching a single program. The intention of this exercise was not for the audience to direct a multiple-camera shoot of one event, but rather audience-guided coverage by several single-camera crews covering different events at a single venue.

While the main “show” was broadcast, the other units would be streaming their audio and video feeds without user direction. This approach was chosen for several reasons. First, the amount of manpower required to parse the direction from general chatter and then relay it to the remote unit was too much. We believed that an on-camera producer was needed at each site, and that a unit producer to moderate/parse direction from the crowd was also required. These problems could be overcome, but the approach raises some questions. Some users may want to be in complete control of what’s going on. For them, a single event site that allows individual selection of different views of the events would be attractive. Users preferring more passive consumption might want to just see a more traditional treatment of the subject with occasional input to the crew. Still others will just observe, with no desire to interact. One other component not included in this treatment is that of participants at the event. Streaming our feed to the attendees’ smartphones and including their participation adds another dimension that may engage both actual and virtual participants.

Ed suggested another experiment: multiple cameras documenting a single event from different perspectives, a sporting event for example. Individuals equipped with their own camera phones would shoot the action from their seats. The resulting images would be aggregated to a single website with all the cameras visible. The viewer would then be able to select a single angle.

This leads us to the site(s) itself.

The original concept was to have one main landing page. It would serve as an interactive viewing platform. The user sees video of the selected remote crew in one window that can be popped out and repositioned by the user. A second chat window, also detachable, is used for commenting and to relay direction to the producer. There were not enough people to process directions for all the crews. Instead, each crew would “freelance,” or assume they were independently live, until they were called to appear on the main screen. On a second page, video from the three crews would be displayed in smaller windows and the user would select the one they wanted. The selected crew would be enlarged and the others would fade back. A corresponding chat window would be joined to each crew for users’ comments and conversation, but there would be no user direction. Questions with this approach: How do we drive the windows with real-time updates (one possible answer is sketched below)? Chat components – pop-out function – after-event site – pre-event site – directions are good – intuitive design is better.
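
On the real-time update question, one lightweight option is server-sent events: each page keeps a single open HTTP connection and the server pushes a note whenever the featured crew changes. The sketch below is a minimal standard-library illustration under that assumption; the endpoint name and the rotating “active camera” signal are hypothetical stand-ins for whatever the real site would publish.

```python
import itertools
import time
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

class CameraEvents(BaseHTTPRequestHandler):
    """Push the currently featured camera to the page via server-sent events."""

    def do_GET(self):
        if self.path != "/events":          # hypothetical endpoint name
            self.send_error(404)
            return
        self.send_response(200)
        self.send_header("Content-Type", "text/event-stream")
        self.send_header("Cache-Control", "no-cache")
        self.end_headers()
        # Stand-in for a real "which crew is on the main screen" signal:
        # rotate through the three crews every few seconds.
        for cam in itertools.cycle(["cam1", "cam2", "cam3"]):
            self.wfile.write(f"data: {cam}\n\n".encode("utf-8"))
            self.wfile.flush()
            time.sleep(5)

if __name__ == "__main__":
    # A page would subscribe with: new EventSource("http://localhost:8000/events")
    ThreadingHTTPServer(("", 8000), CameraEvents).serve_forever()
```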

Small camera techniques

When we say hand-held camera in television we mean shoulder-mounted. Using a smaller camera requires camera support, a wide-angle lens, less zooming, more static shots, etc. Let’s start to experiment with the Canon T2i – it does 1080 HD and has HDMI out – memory cards (4x16GB?). Instead we use a Lumix FH22.

A staple of video is the zoom in/zoom out and the pan left/pan right – these techniques are not as effective with a fixed lens and non-fluid-head camera support. It’s a throwback to early hand-held film techniques. Why does Haskell Wexler come to mind?

The 3 levels of video product


1) Broadcast Quality – Primarily, but not necessarily, HD. This is the stuff produced by senior craft persons. All measurable audio and video standards are rigidly adhered to. Extreme attention is paid to lighting, sound, frame composition, camera movement, writing, scoring, editing, directing – in short, the works. This level of quality is reserved for long-form documentaries, live performances, and specials.

2) Good Quality – Technically, it’s difficult to produce video today that is NOT broadcast quality. This is not a reflection on the lighting, composition, editing, or other assumed qualities of the product; it is a technical specification (RS-170A). The majority of this content is produced in “auto-everything” mode. For the image acquisition phase, auto focus, auto iris, automatic audio gain control, etc. are used. Little technical skill is required to produce this material. This should not be confused with aesthetic or operational skill. A nicely composed shot that follows an action smoothly and is edited at precisely the right point to convey a message, mood, etc. is not “bad.” The idea behind the “Good Quality” level of product is that it requires a minimal amount of technical processing. The files from the camera go right into the system at full resolution and are edited. They are then completed and distributed. This is the level of quality used for most news packages and various other compilation/TV-magazine formatted programming.

3) Acceptable Quality – Again, technically this material can be broadcast. In addition to being produced with almost all settings on auto, it may contain poor audio quality, unsteady shots, bad edits, and a host of other irregularities that are the product of the equipment or the transport. Think Skype video from Haiti. The degree to which this content is technically and/or aesthetically acceptable needs to be determined, and once the bar is set, adhered to. Of course, all bets are off when the content is compelling. Who makes that call? The producer of the vehicle highlighting said content?

Redefining hand-held camera work

We’ll talk history later. Suffice it for now to say that a language of communicating with images, words, and later sound developed over time. The technology of each era had drawbacks or limitations. Large, heavy equipment combined with very slow acquisition methods resulted in a fixed camera position where the action took place in a single scene. Mise en scène.

We’ll get back to that.

We’ve got this tiny little camera and it needs supporting. We’re going to practice using it to document a conversation. One requirement is that the individual operating the camera be concerned with composition (framing the shot) and dynamics (movement within the shot and/or moving the frame).

Zooming is discouraged. It’s better to physically move the camera to where the point of interest is.

The wider the lens – the less disturbing the motion of the frame is.

Very odd angles, particularly of elevation (pedestal), are sometimes effective – very high or very low.

Long arms are a distinct advantage. Hold the camera out and away from you. Can you get a two-shot, or just a medium shot of yourself? You will need to be able to ignore the camera while at the same time operating it. How is this best accomplished?

Pans and tilts usually don’t work well. The motion stabilization handles slow moves better than fast ones, but they ain’t great either.

Placing the camera on a table works for some interviews.

Sometimes motion isn’t right. Stills are good. They can be used to convey a story just as powerfully as motion.