Some companies use so-called deep learning algorithms to perform work previously left to the human brain, such as recognizing faces in photos, reading words and picking out specific objects in images.
Facebook is now taking a step toward deep learning-based artificial intelligence (AI) with DeepText, a natural language processing engine that can understand the textual content of thousands of Facebook posts per second in more than 20 languages. About 400,000 new stories and 125,000 comments on public posts are shared every minute on Facebook.
Announced this week, DeepText uses several deep neural network architectures, including convolutional and recurrent neural nets, Facebook said in a blog post announcing the technology. It supports both word-level and character-level learning, and models are trained using Facebook's FBLearner Flow and Torch.
Trained models are served with a click of a button via the FBLearner Predictor platform, which provides scalable and reliable model-distribution infrastructure. Facebook engineers can also build new DeepText models through the engine's self-serve architecture.
Facebook is among several companies, including Google and Microsoft, that have recently started incorporating deep learning algorithms into their software. In April, Google released its natural language understanding system, SyntaxNet, as open source. SyntaxNet uses deep neural networks to parse human language and extract accurate meaning from words.
Traditional techniques in that area have had to rely heavily on preprocessing logic built on complex engineering and language knowledge. Earlier technology has also struggled with slang, alternative spellings and other variations within languages.
"Using deep learning, we can reduce the reliance on language-dependent knowledge, as the system can learn from text with no or little preprocessing," wrote Facebook’s Ahmad Abdulkader, Aparna Lakshmiratan and Joy Zhang on the company’s blog. "This helps us span multiple languages quickly, with minimal engineering effort."
Context Is Key
Another challenge in the quest for AI that genuinely simulates human communication is that similar-sounding phrases can mean very different things; researchers have so far struggled to build algorithms that can grasp such subtle nuances in language.
DeepText would understand not only the individual words, but also the context and intent that surround them -- for instance, distinguishing between "I need a ride" and "I found a ride," and therefore knowing when a user wants a taxi and when the user no longer needs one, according to Facebook.
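The ride example can be caricatured with keyword rules: the verb around "ride" flips the meaning. This sketch is purely illustrative; DeepText learns such distinctions from data with neural networks rather than hand-written rules, and the word lists below are invented for the example.

```python
# Hypothetical cue words for the two intents.
NEED_CUES = {"need", "want", "looking"}
FOUND_CUES = {"found", "got", "have"}

def ride_intent(post):
    """Classify a post's ride intent from surrounding cue words."""
    words = set(post.lower().replace("?", "").replace(".", "").split())
    if "ride" not in words:
        return "no-ride-intent"
    if words & FOUND_CUES:
        return "ride-found"
    if words & NEED_CUES:
        return "ride-needed"
    return "unknown"

print(ride_intent("I need a ride"))    # ride-needed
print(ride_intent("I found a ride"))   # ride-found
```

The brittleness of such rules ("I no longer need a ride" would be misread) is precisely why learned, context-aware models are preferred.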
"For example, someone could write a post that says, ‘I would like to sell my old bike for $200, anyone interested?’ DeepText would be able to detect that the post is about selling something, extract the meaningful information such as the object being sold and its price, and prompt the seller to use existing tools that make these transactions easier," wrote Abdulkader, Lakshmiratan and Zhang.