That’s a trite summation of what’s happening, but according to Gary Marcus, the CEO and co-founder of Robust.AI, AI developers and researchers will need to augment their approach before any real progress toward “robust” artificial intelligence can be made.

Marcus published a new paper on arXiv earlier this week titled “The Next Decade in AI: Four Steps Towards Robust Artificial Intelligence.” In the 55-page document he sums up and expands upon his recent arguments from the 2019 “AI Debate” between himself and Yoshua Bengio.

The gist of what Marcus is saying is captured in a sentiment he attributes to members of the Facebook AI team: like a chicken playing tic-tac-toe, AI doesn’t have the slightest clue what it’s doing. It just modifies and repeats whatever it was programmed to do until a human decides the “parameters” for its behavior are properly adjusted.

Marcus argues that AI has no actual understanding because, unlike humans, it has no internal model of the world and of how it and the objects in it function. The prescription, he says, is a hybrid developmental paradigm that combines deep learning with a cognitive-model approach.

This approach is a departure from the current pie-in-the-sky efforts of numerous startups, big tech companies, and organizations that have dedicated their work to creating “Artificial General Intelligence,” or super-human AI. Marcus instead advocates for a developmental restructuring that targets an achievable middle ground — the “next level of AI” — before we get to the far-off age of superintelligent machines.

The meat of the problem is that deep learning is not a very good approximation of human reasoning.
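To make the hybrid idea concrete, here is a minimal caricature in Python — not code from Marcus’ paper, and every name in it is invented for illustration. A stand-in “perception” component (where a learned deep model would sit) maps raw input to symbolic facts, and a hand-written symbolic world model then reasons over those facts with explicit rules:

```python
# Toy neuro-symbolic sketch (all names invented for illustration).
# perceive() stands in for a learned model; reason() is the cognitive model.

def perceive(raw_input: str) -> set:
    """Stand-in for a trained perception model: raw input -> symbolic facts."""
    facts = set()
    if "cup" in raw_input:
        facts.add(("object", "cup"))
    if "edge" in raw_input:
        facts.add(("near", "cup", "table_edge"))
    return facts

RULES = [
    # Explicit world knowledge: things near an edge may fall -- a fact a pure
    # pattern-matcher has no explicit representation of.
    (lambda facts: ("near", "cup", "table_edge") in facts, ("may_fall", "cup")),
]

def reason(facts: set) -> set:
    """Apply symbolic rules to perceived facts to draw new conclusions."""
    inferred = set(facts)
    for condition, conclusion in RULES:
        if condition(facts):
            inferred.add(conclusion)
    return inferred

conclusions = reason(perceive("cup near table edge"))
print(("may_fall", "cup") in conclusions)  # True
```

The design point is the division of labor: the learned component handles messy input, while the explicit model carries the “enduring, abstract knowledge” that lets the system draw conclusions it was never directly trained on.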
Anyone who’s ever fumbled through several different commands before landing on the right one to “trigger” the proper response from a smart speaker has dealt with AI’s inability to “understand.” When Google Assistant or Alexa fails to process a command that makes sense but doesn’t use the right phrasing, it’s reacting no differently than if we’d pushed the wrong button on a touch pad: there’s no sense or intelligence there.

We’ve said before that most AI is either just an output funnel for vast amounts of data or prestidigitation akin to a magician making it appear as though they’d pulled a robot out of their hat. The truth is that Alexa, that GPT-2 text generator everyone’s scared of, and Tesla’s Autopilot system are all one-trick ponies. Even DeepMind’s AlphaGo, the computer that beat the world’s best players at what is arguably the world’s toughest game, would get its ass kicked in a game of Monopoly or Scrabble unless someone took the time to completely retrain it.

Marcus insists that we need “an intelligence framed around enduring, abstract knowledge” if we’re to move artificial constructs forward toward human-level reasoning. Throughout history there are tales of scientists gleaning inspiration from unrelated events: Newton supposedly pondered gravity after wondering why apples fell straight down, and Velcro was allegedly invented after an engineer went hiking and got cockleburs stuck to their pants. The point is, AI doesn’t have inspiration or the ability to gather abstract information for unspecified use across future learning domains. And until it does, we’re pretty far away from having “robust AI,” and much, much further from “human-level” or “superintelligent” machines.

For more information, read the full paper on arXiv, and check out “Rebooting AI” by Gary Marcus and Ernest Davis.

You’re here because you want to learn more about artificial intelligence. So do we.
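The smart-speaker brittleness described above can be caricatured in a few lines of Python. This is not any real assistant’s code — the command strings and function names are invented — but it shows the failure mode: a system that matches surface patterns handles known phrasings and simply fails on a paraphrase with the same intent.

```python
# Toy "assistant" (invented for illustration): exact surface-pattern matching
# with no model of what the words mean.

INTENT_PATTERNS = {
    "turn on the lights": "lights_on",
    "set a timer for ten minutes": "timer_10m",
}

def handle(command: str) -> str:
    """Return an action only if the exact phrasing is known."""
    action = INTENT_PATTERNS.get(command.strip().lower())
    return action if action else "sorry_didnt_understand"

print(handle("Turn on the lights"))         # known phrasing -> lights_on
print(handle("Could you make it bright?"))  # same intent, new words -> fails
```

Real assistants are far more sophisticated than a lookup table, but the underlying complaint is the same: without a model of what “bright” has to do with “lights,” rephrasing an intent is indistinguishable from gibberish.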
So this summer, we’re bringing Neural to TNW Conference 2020, where we will host a vibrant program dedicated exclusively to AI. With keynotes by experts from companies like Spotify and RSA, our Neural track will take a deep dive into new innovations, ethical problems, and how AI can transform businesses. Get your early bird ticket and check out the full Neural track.