Crowdsourcing app could transform how we experience live events
Software engineers rely on a number of fundamental principles when creating new technologies for consumers. One of the more popular is a design concept called “recognition rather than recall.” This principle, long studied by cognitive scientists, holds that well-designed products should reduce users' active memory load so they can focus on the task at hand without having to dredge up details that are difficult to recall.
Technology companies have long used a variety of design techniques to achieve this outcome, from menu icons that users recognize at a glance to familiar color-coding schemes. More recently, the convergence of social media platforms and mobile devices has allowed a company called CrowdOptic to reapply the concept in a new technology with the potential to create significant value.
The CrowdOptic app works on the premise that people at public events, such as a college football game, will use their phones to take pictures and videos of noteworthy moments as they happen. Since many devices within a small geographic area will presumably be aimed at the same thing, that overlap serves as a validation measure for the application. Essentially, the company is crowdsourcing: it combines the images coming off each phone with the device's GPS position and the other metadata attached to those images to gauge the relevance of whatever is being observed.
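As a rough sketch of that corroboration step (the thresholds, names, and data structures here are hypothetical, not CrowdOptic's actual implementation), a service could treat a burst of uploads as a shared event only when several photos cluster in both space and time:

```python
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

@dataclass
class PhotoMeta:
    lat: float        # GPS latitude of the capturing device (degrees)
    lon: float        # GPS longitude of the capturing device (degrees)
    timestamp: float  # capture time (Unix seconds)

def distance_m(a: PhotoMeta, b: PhotoMeta) -> float:
    """Great-circle (haversine) distance between two devices, in meters."""
    dlat, dlon = radians(b.lat - a.lat), radians(b.lon - a.lon)
    h = sin(dlat / 2) ** 2 + cos(radians(a.lat)) * cos(radians(b.lat)) * sin(dlon / 2) ** 2
    return 2 * 6_371_000 * asin(sqrt(h))

def looks_like_shared_event(photos: list[PhotoMeta], radius_m: float = 200.0,
                            window_s: float = 60.0, min_pairs: int = 3) -> bool:
    """Treat a burst of uploads as a noteworthy event only if several devices
    shot from roughly the same spot within a short time window."""
    corroborating = sum(
        1
        for i, a in enumerate(photos)
        for b in photos[i + 1:]
        if distance_m(a, b) <= radius_m and abs(a.timestamp - b.timestamp) <= window_s
    )
    return corroborating >= min_pairs
```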
According to Fast Company, the application pinpoints the shared subject "by sensing the iPhone's GPS location, compass heading, and time of day to know which object is most likely being viewed through the iPhone screen. It needs at least one other user looking at the same object to triangulate the position." Once the application confirms the location of the subject, it can tell the user, for instance, which band is playing at an event and even which song is currently being played. The collected metadata is also tracked wherever it travels online, which can be valuable to companies and brands: it teaches them something about their consumers and adds a social dimension to live events that they would not otherwise get. In the words of Jon Fisher, the CEO, "it's about a shared interest. No hashtags required."
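The triangulation Fast Company describes can be illustrated with a little geometry: if two phones report their positions and compass headings, the point where their lines of sight cross is the likely subject. The sketch below assumes a flat-earth approximation and error-free sensors, which a production system would not; it is purely illustrative.

```python
from math import radians, sin, cos

def focal_point(lat1, lon1, heading1, lat2, lon2, heading2):
    """Estimate the point two phones are aimed at by intersecting their
    bearing lines. Headings are compass degrees (0 = north, 90 = east).
    Uses a local flat-earth approximation, fine over stadium distances."""
    # Express device 2's position in meters east/north of device 1.
    m_per_deg_lat = 111_320.0
    m_per_deg_lon = 111_320.0 * cos(radians(lat1))
    x2 = (lon2 - lon1) * m_per_deg_lon
    y2 = (lat2 - lat1) * m_per_deg_lat
    # Unit direction vectors (east, north) from the compass headings.
    d1 = (sin(radians(heading1)), cos(radians(heading1)))
    d2 = (sin(radians(heading2)), cos(radians(heading2)))
    # Solve p1 + t*d1 == p2 + s*d2 for t using 2D cross products.
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-9:
        return None  # lines of sight are parallel; no single focal point
    t = (x2 * d2[1] - y2 * d2[0]) / denom
    fx, fy = t * d1[0], t * d1[1]
    # Convert the intersection back to latitude/longitude.
    return lat1 + fy / m_per_deg_lat, lon1 + fx / m_per_deg_lon
```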
While the company uses proprietary technology to verify this correlation, in common-sense terms it is banking on the presumption that crowds tend to focus on the same thing in group settings.
The “recognition rather than recall” principle is thus built into CrowdOptic's technical architecture: spectators at an event rarely have much information at their fingertips, so rather than force them to recall facts they don't know or can't remember, the application displays information about whatever they were just focused on. In January, this technology was used at the Aircel Chennai Open in India to provide "Focus-Based Services" to fans.
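The "focus-based" lookup itself could be as simple as matching the computed focal point against a venue's known points of interest. The data and function below are hypothetical, meant only to show how recognition can replace recall once the focal point is known:

```python
# Hypothetical venue data: points of interest and what is happening at each right now.
POINTS_OF_INTEREST = {
    "Centre Court":   {"lat": 13.0501, "lon": 80.2122, "now": "Men's singles final"},
    "Practice Court": {"lat": 13.0508, "lon": 80.2131, "now": "Open practice session"},
}

def describe_focus(focal_lat: float, focal_lon: float) -> str:
    """Return the point of interest nearest to where the crowd is aiming,
    so the app can display it rather than ask the user to recall it."""
    def squared_dist(info: dict) -> float:
        # Small-area approximation: raw degree differences are fine within one venue.
        return (info["lat"] - focal_lat) ** 2 + (info["lon"] - focal_lon) ** 2
    name, info = min(POINTS_OF_INTEREST.items(), key=lambda kv: squared_dist(kv[1]))
    return f"{name}: {info['now']}"
```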
The only downside to this sort of technology, at least in the commercial space, is that we as a culture will finally be able to retire that annoying sports trivia guy (we all know one), since anything he took the time to memorize we will now have at our fingertips simply by pointing a mobile device at whatever interests us.
The metadata collected by these sorts of applications can also hold considerable value for advertisers. CrowdOptic believes it is creating a new business model that targets mobile ads not only by geography but also by the type of content users themselves identify as relevant to their lifestyle. The company is not alone in that belief: within the past year it has received $2.5 million in funding to continue developing the technologies behind its products.
CrowdOptic has also touted its technology as a solution in the security space. In a press release, the company says that it "has seen strong demand for security applications of its technology used to enhance crowd surveillance and intelligence during live events" and has signed a deal with Andrews International, a large security and risk-management services provider in Los Angeles. Andrews believes the technology will let its personnel respond to situations more quickly by flagging possible danger, which it does by detecting anomalies or disruptions in crowd behavior based on how people are using their mobile devices.
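The press release does not spell out how such alerting works, but one plausible reading, and purely a guess here, is that an incident looks like many devices suddenly swinging toward the same spot. A toy detector along those lines, with illustrative thresholds that are not CrowdOptic's:

```python
from collections import Counter

def sudden_convergence(headings_before: list[float], headings_after: list[float],
                       bin_deg: int = 15, jump: float = 0.4) -> bool:
    """Flag a possible incident when the share of devices pointing into any one
    compass sector rises sharply between two snapshots of the crowd."""
    def top_share(headings: list[float]) -> float:
        if not headings:
            return 0.0
        bins = Counter(int(h % 360) // bin_deg for h in headings)
        return max(bins.values()) / len(headings)
    return top_share(headings_after) - top_share(headings_before) >= jump
```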
On the commercial side, CrowdOptic's technology could also pair well with several existing products. One example is Microsoft's Photosynth, which lets users build three-dimensional images from their smartphones. Photosynth has been used by many high-profile organizations: CNN compiled a 3D collection of photos from President Obama's inauguration, NASA has modeled spacecraft, and MGM created a complete virtual tour of "Destiny," the ship from Stargate. Similar technology was used to create Google's Street View, and Microsoft uses Photosynth in its own Bing Maps site. If CrowdOptic were used alongside Photosynth, Bing could draw on crowdsourced images to update its maps in real time without having to send out a fleet of unnerving camera-adorned vans.
Fundamentals, as the term implies, are basic principles with universal appeal, typically proven by science and supported by common sense. The continued application of core principles like recognition rather than recall improves a company's odds of success when it introduces new technology-oriented products. Because of this, companies like CrowdOptic can capitalize on the benefits of convergence by complementing consumers' habits and actions in a way that feels natural and aligned with their lifestyle.