Humanising Sift: Part 1

I am constantly on the lookout for new, engaging ways for customers to interact with Sift. In that pursuit, we are, to our knowledge, the first to voice-enable the Inbound Offer APIs, delivering a rich and refreshing human-machine engagement. We first focused our attention on Amazon's Alexa, one of the most popular voice assistants. The architecture involves the following three layers, which communicate with each other in this order:

  1. Amazon’s Echo, a device that listens to/answers the human
  2. A backend application hosted on Amazon’s infrastructure that manages the conversational flow
  3. Sift Online Server which serves the real time contextual offer upon an inbound request

The Echo device has very decent voice-to-text capability. Out of the box, it integrates well with home-automation devices, letting you operate them by voice command. As a smart extension, it can connect to Amazon's backend ("skills") on specific voice commands. This is where layer 2 comes in: it provides a constrained but unambiguous natural-language capability to infer commands and assemble responses. This layer can also invoke an external web service within the conversation, which is layer 3, the Sift Inbound service.
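To make layer 2 concrete, here is a minimal sketch of what such a skill backend could look like as a Python Lambda handler. The intent name `GetOfferIntent`, the `CustomerId` slot, and the `fetch_contextual_offer` stub are all illustrative assumptions, not the actual Sift implementation; in the real skill, the stub would be an HTTPS call to the Sift Inbound service.

```python
# Hypothetical stand-in for the Sift Online Server (layer 3). In the real
# skill this would be an HTTPS request to the Sift Inbound service carrying
# the caller's context; here it is stubbed with a local lookup.
def fetch_contextual_offer(customer_id):
    offers = {"42": "20% off your next data pack"}
    return offers.get(customer_id, "no offer available right now")


def lambda_handler(event, context=None):
    """Minimal Alexa skill handler (layer 2): routes the incoming request
    and builds the speech response Alexa reads back on the Echo (layer 1)."""
    request = event["request"]
    if request["type"] == "LaunchRequest":
        # User opened the skill without asking anything yet.
        speech = "Welcome. Ask me for your offer."
        end_session = False
    elif (request["type"] == "IntentRequest"
          and request["intent"]["name"] == "GetOfferIntent"):
        # Pull the slot Alexa extracted from the utterance, then
        # fetch the contextual offer from the (stubbed) Sift service.
        customer_id = request["intent"]["slots"]["CustomerId"]["value"]
        speech = "Here is your offer: " + fetch_contextual_offer(customer_id)
        end_session = True
    else:
        speech = "Sorry, I did not understand that."
        end_session = True

    # Alexa-style response envelope: plain-text speech plus session control.
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech},
            "shouldEndSession": end_session,
        },
    }
```

The design keeps the conversational routing (layer 2) cleanly separated from the offer lookup (layer 3), so the Sift call can be swapped in without touching the intent handling.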

The demo video is posted here, and it's real fun.
