The act of second screening has been around for nearly a decade. It’s an activity many of us engage in without even knowing it; two-thirds of internet-connected American households second screen.
Second screening is the act of using a web-capable device while watching a TV show or movie. That’s it. So why the fuss? This activity has been lauded for years as the next way to differentiate the offerings of content providers. What’s the difference between Verizon FiOS and AT&T U-Verse? Well, aside from pricing and content selection, it’s their second screen experience.
Many new media have come to market since the original iPhone (the first commercially successful smartphone): 3D, virtual reality, viable internet video streaming services, and a flood of user-generated content freely viewable by billions. Yet second screening is still hailed, time and again, as the way to stimulate growth in a stagnant television marketplace.
It’s beginning to sound like The Boy Who Cried Wolf. Remember the Wii U? It was pitched as the equally compelling follow-up to Nintendo’s commercially successful Wii console. Instead, it was a beefier Wii with a mandatory tablet controller, and its sales were horrendous. That flop is a testament to consumers’ antipathy toward being force-fed a multi-screen experience.
Users instead prefer at-will engagement that supplements the “big screen,” like Netflix’s recently announced app feature. The approach is restrained: the second screen activates only when the viewer casts from the app, so Netflix knows the second screen is already in hand.
Netflix isn’t taking a major chance; the feature is opt-in and hardly more ambitious than grassroots social media (“OMG cant bleve wut just happened on @MadMen!”). Time will tell how useful Netflix’s subscribers find this newest implementation of a tired concept, but rest assured, the boy will cry again either way.
Upcoming: Crackle announces revolutionary companion app that integrates with Google+