NHI and Intelligent Agents


Why NHI?

The philosophy behind Natural Human Interfacing (NHI) is that a combination of language, device integration, and AI awareness can streamline all manner of software interfaces to reduce touch points, afford more exacting control, and provide intuitive human interaction with applications incorporating AI.

Language input allows single-sentence commands or prompts to encompass vast, complex functions. It can also reduce operating complexity: fewer key and button presses, form fields, and mouse clicks are needed to perform a task.
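
As an illustrative sketch (not drawn from any existing NHI implementation), the snippet below maps a single spoken or typed sentence onto an application function using simple pattern-based intent matching. The command names, patterns, and handlers are hypothetical stand-ins for a real natural-language understanding layer.

```python
# Minimal sketch: one sentence in, one application action out -- no forms,
# no clicks. Commands and handlers here are hypothetical examples.
import re
from typing import Callable

def export_report(fmt: str = "pdf") -> str:
    return f"Exporting report as {fmt}"

def schedule_meeting(when: str) -> str:
    return f"Scheduling meeting for {when}"

# Each entry pairs a pattern (capturing the handler's parameters) with a handler.
COMMANDS: list[tuple[re.Pattern, Callable[..., str]]] = [
    (re.compile(r"export .*report.* as (\w+)", re.I), export_report),
    (re.compile(r"schedule .*meeting.* for (.+)", re.I), schedule_meeting),
]

def handle_utterance(text: str) -> str:
    """Route a single natural-language sentence to an application function."""
    for pattern, handler in COMMANDS:
        match = pattern.search(text)
        if match:
            return handler(*match.groups())
    return "Sorry, I didn't understand that."

if __name__ == "__main__":
    print(handle_utterance("Please export the quarterly report as csv"))
    print(handle_utterance("Schedule a meeting for Tuesday at 3pm"))
```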

1980s Interfaces in the 2020s

If you’re dealing with an AI that can pass the Turing Test, then why is it still operated from the command line? It’s like using a buggy whip to drive a Tesla. The technology is ready for live use; interacting with it should be a conversation, not Wuthering Heights in semaphore.

Users and creators should be able to approach AI tools from any direction in order to maximize their deployment flexibility. Maybe you do just want to use the console’s stdin and stdout, but being able to talk to an application hands-free, or even eyes-free, makes for less obtrusive working conditions. Only touch it when you need to.
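
A minimal sketch of that flexibility is shown below: the same agent handler can be fed from either the console or a voice front end. The --voice flag and the recognize_speech() stub are assumptions made for illustration, not part of any existing NHI tool.

```python
# Hedged sketch: one agent handler, two interchangeable front ends.
import sys

def recognize_speech() -> str:
    # Stand-in for a real speech-to-text engine (hypothetical).
    return "status report please"

def get_utterance(use_voice: bool) -> str:
    if use_voice:
        return recognize_speech()
    # Plain stdin/stdout path: type the request instead of speaking it.
    return sys.stdin.readline().strip()

def agent_handle(utterance: str) -> str:
    # The "brain" does not care which front end produced the text.
    return f"(agent) handling request: {utterance!r}"

if __name__ == "__main__":
    use_voice = "--voice" in sys.argv
    print(agent_handle(get_utterance(use_voice)))
```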

NHI: UX At A Distance

The benefits of hands-free interaction with intelligent agents are manifold across a variety of applications, professions, and industries. Such agents can execute computational tasks, display media, provide needle-in-a-haystack signal search, cognition, and relay, and log and analyze ongoing interactions.

Take, for example, NHI in Education:

Education can benefit from NHI fulfilling many of the primary interactions a human teacher provides. An NHI-driven intelligent agent educator can interact with the student via speech I/O, display relevant media, write on a virtual blackboard during lectures to illustrate points, administer tests, and evaluate and guide the student in real time.

An NHI-driven intelligent agent educator can be especially useful to handicapped and special-needs students. For the blind, there is an obvious advantage in tailoring the NHI lesson interactions to speech and sound media. For the deaf, an avatar can interpret text into sign language and use pose detection to translate student sign language and gestures into command and control, query prompting, and question answering, beyond being tuned to offer visual media and text for the lesson at hand.

The same “brain” directing and evaluating the lesson should be able to present the same curricula, achieve the same level of testable results within the knowledge domain, and advance lesson by lesson at the student’s pace, shifting seamlessly between interaction modes as required for the student to understand the materials at hand.
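
One way to picture that single “brain” behind interchangeable interaction modes is a simple strategy pattern, sketched below under stated assumptions: the Modality classes and lesson content are invented for illustration, with print statements standing in for real text-to-speech, display, and signing-avatar back ends.

```python
# Hedged sketch: one lesson "brain", swappable output modalities.
from abc import ABC, abstractmethod

class Modality(ABC):
    @abstractmethod
    def present(self, content: str) -> None: ...

class SpeechModality(Modality):
    def present(self, content: str) -> None:
        print(f"[speak] {content}")          # stand-in for a TTS engine

class VisualModality(Modality):
    def present(self, content: str) -> None:
        print(f"[blackboard] {content}")     # stand-in for on-screen media

class SignAvatarModality(Modality):
    def present(self, content: str) -> None:
        print(f"[avatar signs] {content}")   # stand-in for a signing avatar

class LessonBrain:
    """Same curriculum and evaluation logic, regardless of output mode."""
    def __init__(self, modality: Modality):
        self.modality = modality

    def teach(self, lesson: list[str]) -> None:
        for step in lesson:
            self.modality.present(step)

    def switch_mode(self, modality: Modality) -> None:
        self.modality = modality             # shift modes mid-lesson

lesson = ["Nouns name things.", "Verbs describe actions."]
brain = LessonBrain(SpeechModality())
brain.teach(lesson)
brain.switch_mode(SignAvatarModality())
brain.teach(lesson)
```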

Or, NHI in Industry:

Portability and continuity of systems are critical in situations where a device cannot be networked; nothing is guaranteed in the field. Worrying about connection latency is the last thing you want to do during an outage or failure, much less at the bottom of the ocean or in space, when your hands are already full.

NHI intelligent agent-assisted field repairs, parts identification and replacement, voice-driven hands-free calculation results, sensor data analysis, and functional command and control can all be augmented via voice, gesture, image, and video stream I/O. The inference engine and command functionality are knowledge domain-specific, trained for that particular set of tasks, and loaded onto the device.
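
A rough sketch of that on-device arrangement is given below, using a made-up parts table and keyword routing in place of a trained inference engine; every part number and command here is invented for illustration.

```python
# Hedged sketch of offline, on-device operation: a domain-specific parts
# catalogue and command routing bundled with the agent, no network needed.
PARTS_DB = {
    "valve-7a": {"replacement": "valve-7b", "torque_nm": 12},
    "seal-03":  {"replacement": "seal-03r", "torque_nm": 4},
}

def identify_part(part_id: str) -> str:
    info = PARTS_DB.get(part_id.lower())
    if info is None:
        return f"Unknown part: {part_id}"
    return (f"{part_id}: replace with {info['replacement']}, "
            f"tighten to {info['torque_nm']} N*m")

def handle_field_command(utterance: str) -> str:
    """Route a hands-free request to a local, domain-specific handler."""
    words = utterance.lower().split()
    if "replace" in words or "identify" in words:
        return identify_part(words[-1])
    if "torque" in words:
        part = words[-1]
        info = PARTS_DB.get(part)
        return f"Torque for {part}: {info['torque_nm']} N*m" if info else "Not found."
    return "Command not recognized."

print(handle_field_command("identify valve-7a"))
print(handle_field_command("what is the torque for seal-03"))
```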

Just Act Natural

NHI should be a core set of abilities available to a “brain” driving AI interactions with humans. If intelligent agents are to be assistants and savants that humans utilize as tools to accomplish tasks for the betterment of the human race, it is an obvious postulate that these agents should be conversant in real human interaction. You wouldn’t send R2-D2 to teach grammar; C-3PO is clearly the better pick.

