Fake news and tech collided in New Orleans. Here's what we learned
My business partner, Alex Vogel, and I were involved in some sort of collision in New Orleans this week. Some ten thousand like-minded entrepreneurs and tech CEOs were crammed into a convention hall. Words were exchanged, traded, reconsidered, mixed, and mingled. Investors danced with Alphas. The result was something of a metamorphic knowledge factory, yielding more than a few insights into how Grapple's forthcoming search concept might take shape.
- Advanced search solutions are here. Atlas, a Seattle-based startup, was exhibiting its context-based search tool. Atlas runs as a browser extension that indexes your entire web history. It can use contextual cues, in addition to keywords, to provide an almost photographic memory of what you look at online. Try "that red t-shirt I saw yesterday," or "that company I looked up when I was at the airport." It's all there, but never before has it been so easy to find. Impressive stuff.
- Fake news is a hot topic in the tech community, but tools ready to help with the task of finding and culling the worst falsehoods are few and far between. We attended several panels on fake and deceptive news at Collision, featuring media figures John Avlon, editor-in-chief at The Daily Beast; Jared Grusd of the Huffington Post; Adam Singolda of Taboola; and Walter Isaacson, formerly of CNN. Speakers focused on public education strategies, including those underway at web platforms Facebook and Twitter. The solutions discussed revolved heavily around empowering news audiences with knowledge about potentially fake or misleading news sources. A European software startup, Factmata, is promising sophisticated artificial-intelligence algorithms to seek out fake news ahead of June elections in the United Kingdom, but it's unclear how individuals will access or interact with its software.
- Stephen Wolfram, the award-winning computer scientist most widely known for his Wolfram Alpha search tool, discussed the potential to teach computers to find fake or misleading news content. I had the chance to speak directly with Mr. Wolfram during and after a question-and-answer session at Collision. Wolfram suggested that machine-learning datasets could be used to pick up on various tells within content. One such tell is the extensive use of adjective strings among marketing professionals. "For example, if I see something written as this, this, and this, I can tell that whatever I'm reading was probably written by someone with a marketing degree," Wolfram said. Using neural networks to absorb such habits could help software like Grapple's identify more sophisticated clues about the reliability of written content, perhaps even better than contextual elements could. Some of Wolfram's ideas are surely under consideration in development circles and could someday be paired with Grapple's search tools.
- There is great interest among online content providers in improved quality-control solutions on the production end. During the week-long event in New Orleans, I had conversations with several news executives at both national and local outlets. All saw commercial potential in a software tool that prevents errors in journalism and online publication. Quality control is among the top unsolved problems in news circles, where many human quality-control departments have been decimated in recent years.
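To make the Atlas idea above concrete: a context-based history search boils down to indexing each page visit with its text plus contextual metadata (time, place), then filtering on both. Atlas's actual implementation isn't public, so the class and field names below are illustrative assumptions, not their API.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class PageVisit:
    """One entry in the browsing-history index (hypothetical schema)."""
    url: str
    text: str                 # page text captured at visit time
    visited_at: datetime
    context: dict = field(default_factory=dict)  # e.g. {"location": "airport"}

class HistoryIndex:
    """Toy in-memory index: keyword match plus optional context filter."""
    def __init__(self):
        self.visits = []

    def add(self, visit):
        self.visits.append(visit)

    def search(self, keywords=(), context=None):
        results = []
        for v in self.visits:
            text = v.text.lower()
            # every keyword must appear in the captured page text
            if not all(k.lower() in text for k in keywords):
                continue
            # every requested context key must match the visit's metadata
            if context and any(v.context.get(k) != val for k, val in context.items()):
                continue
            results.append(v)
        return results

# Usage: "that company I looked up when I was at the airport"
idx = HistoryIndex()
idx.add(PageVisit("https://shop.example/red-tee", "red t-shirt, cotton",
                  datetime(2017, 5, 2), {"location": "home"}))
idx.add(PageVisit("https://acme.example", "Acme Robotics company profile",
                  datetime(2017, 5, 3), {"location": "airport"}))

hits = idx.search(keywords=["company"], context={"location": "airport"})
print([v.url for v in hits])  # ['https://acme.example']
```

A production system would replace the linear scan with a proper inverted index and capture context automatically from the browser, but the query shape, keywords plus context, is the interesting part.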
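Wolfram's adjective-string "tell" can be sketched as a feature extractor: score a passage by its longest run of consecutive adjectives. He described the idea in terms of trained neural networks; the hand-picked word list and threshold below are a crude stand-in for a real part-of-speech tagger, purely to show the shape of the feature.

```python
import re

# Tiny hand-picked adjective list standing in for a real POS tagger.
# (An actual system would learn these cues from training data.)
ADJECTIVES = {"amazing", "revolutionary", "innovative", "powerful",
              "seamless", "stunning", "incredible", "effortless"}

def longest_adjective_run(text):
    """Length of the longest consecutive run of adjectives in the text."""
    words = re.findall(r"[a-z']+", text.lower())
    best = run = 0
    for w in words:
        run = run + 1 if w in ADJECTIVES else 0
        best = max(best, run)
    return best

def looks_like_marketing_copy(text, threshold=3):
    # One 'tell': long strings of stacked adjectives.
    return longest_adjective_run(text) >= threshold

print(looks_like_marketing_copy(
    "Our amazing, revolutionary, seamless platform changes everything."))  # True
print(looks_like_marketing_copy(
    "The committee met on Tuesday to review the budget."))  # False
```

In a learned system, features like this run length would be one signal among many feeding a classifier, rather than a hard-coded rule.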
We will exhibit our technology more extensively next month, during a three-city European tour. Grapple has accepted invitations to attend June technology investment conferences in Krakow, Paris and Vienna. We are aiming to have a viable prototype in action by July. Stay tuned, and thanks for your support!
P.S. For those who needed it, here's proof. John enjoyed some beignets and some particularly potent gumbo served at a certain residence. Thus, this photo is going at the bottom: