Natural language interfaces are becoming increasingly common, but they are difficult to build, to maintain, and to port to new domains. NLCI, the Natural Language Command Interpreter, is an architecture for building and porting such interfaces quickly.
NLCI accepts commands as plain English text and translates the input sentences into sequences of API calls that implement the intended actions. At its core is an ontology that models the target API. A natural language understanding pipeline then analyzes the English input and generates source code. The analyses are independent of any particular API; switching to a different API only requires supplying a new ontology.
In this demonstration we show how developers can provide a natural language interface for their API by preparing an API ontology. We also show how NLCI analyzes the input text, how we evaluated its results, and how well it performs. As an example we use an API that steers a Lego EV3 robot.
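To illustrate the kind of translation NLCI performs, the following toy sketch maps a restricted English command to API-call strings via a dictionary that stands in for the API ontology. All names here (the `drive`/`turn` verbs, the `robot.drive_forward` and `robot.turn` methods) are invented for illustration and are not part of NLCI or the EV3 API.

```python
# Hypothetical sketch of command-to-API translation.
# The ONTOLOGY dict is a toy stand-in for NLCI's API ontology:
# it maps a verb to an (invented) API method name.
ONTOLOGY = {
    "drive": "robot.drive_forward",
    "turn": "robot.turn",
}

def interpret(sentence):
    """Translate a '<verb> <number>' command (clauses joined by
    'and') into a list of API-call strings."""
    calls = []
    for clause in sentence.lower().rstrip(".").split(" and "):
        verb, arg = clause.split()[:2]
        calls.append(f"{ONTOLOGY[verb]}({arg})")
    return calls

print(interpret("Drive 50 and turn 90."))
# -> ['robot.drive_forward(50)', 'robot.turn(90)']
```

The real pipeline is far richer (full syntactic and semantic analysis, ontology matching), but the sketch shows the essential contract: English in, a sequence of API calls out, with the API-specific knowledge isolated in the ontology.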