yAlexa Technical Architecture & Update for our Alexa-based prototype

Besides cleaning up our new labs space, we’ve been pretty productive this week, and I want to give you a quick update on our progress with Voice User Interfaces. yAlexa (see previous post), our prototype around Hybris as a Service and Amazon Alexa, is taking shape. This week was devoted to adding a demo UI that keeps track of the voice actions directed at Alexa. In addition, I’ve created a technical architecture that I want to quickly share.
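As a rough sketch of the idea (not the actual yAlexa code – the Flask framework, route names, and the simple polling approach are all assumptions on my part), the backend could record each incoming intent and expose that log for the demo UI to display:

```python
# Minimal sketch: log incoming Alexa intents and expose them to a demo UI.
# Framework, routes, and response wording are illustrative assumptions.
from flask import Flask, jsonify, request

app = Flask(__name__)
voice_actions = []  # in-memory log; a real backend would persist this

@app.route("/alexa", methods=["POST"])
def alexa_webhook():
    body = request.get_json(force=True)
    intent = body.get("request", {}).get("intent", {}).get("name", "unknown")
    voice_actions.append(intent)
    # Reply in the standard Alexa skill response format
    return jsonify({
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": "Got it: " + intent},
            "shouldEndSession": True,
        },
    })

@app.route("/actions")
def list_actions():
    # The demo UI polls this endpoint to show recent voice actions
    return jsonify(voice_actions)

if __name__ == "__main__":
    app.run(port=8080)
```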

Fun with Alexa & Hybris as a Service: yAlexa

It’s a shame I haven’t written about this earlier. We’ve got Amazon’s Alexa and also Google Home available at Hybris Labs in Munich, but I’ve had so many other things going on that I just could not concentrate on it much. Today, I finally had a few hours to play a bit more with Amazon’s Alexa. While I still need to do more with Google Home, I’ve now tried both to some degree. I find the overall programming and configuration simpler on the Alexa side, although Amazon of course also tries to totally lock you in with its AWS Lambda functions – but you have a choice, and my choice was to use my own Cloud Foundry-based backend and YaaS APIs to implement the business logic.
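To sketch what that backend flow could look like in practice (the endpoint URLs, tenant, and credentials below are placeholders, not the real yAlexa configuration), the skill handler first obtains an OAuth2 client-credentials token and then calls a YaaS API with it:

```python
# Sketch of calling YaaS from a custom skill backend instead of AWS Lambda.
# TOKEN_URL, PRODUCT_URL, and the credentials are placeholder values.
import requests

TOKEN_URL = "https://api.yaas.io/hybris/oauth2/v1/token"  # placeholder
PRODUCT_URL = "https://api.yaas.io/hybris/product/v2/mytenant/products"  # placeholder

def get_token(client_id, client_secret):
    # Standard OAuth2 client-credentials grant
    resp = requests.post(TOKEN_URL, data={
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
    })
    resp.raise_for_status()
    return resp.json()["access_token"]

def fetch_products(token):
    # Call the (placeholder) YaaS product API with the bearer token
    resp = requests.get(PRODUCT_URL, headers={"Authorization": "Bearer " + token})
    resp.raise_for_status()
    return resp.json()
```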

Project X-Ray

Two of our Hybris Labs prototypes need a tag-to-YaaS mapping. Infinite Cart uses NFC (Near Field Communication), where the tag ID is mapped to a product code (SKU number). For the Changing Room prototype we are using RFID (Radio-Frequency Identification) tags. When we started with this prototype, you had to hold an RFID tag near an RFID scanner and then check the log files to find its ID.
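To make that mapping concrete, here is an illustrative sketch (the tag IDs and SKUs are invented for the example) of resolving a raw tag ID to a YaaS product code:

```python
from typing import Optional

# Illustrative tag-to-YaaS mapping; the tag IDs and SKUs are made up.
TAG_TO_SKU = {
    "04A224D2C93E80": "SKU-1001",  # NFC tag from the Infinite Cart
    "E2000016860B": "SKU-2042",    # RFID tag from the Changing Room
}

def resolve_sku(tag_id: str) -> Optional[str]:
    # Scanners often report IDs with whitespace or mixed case, so normalize first
    return TAG_TO_SKU.get(tag_id.strip().upper())
```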

With the RFID Action Reader, which we’re also using for our Expose prototype, you can read the RFID ID on a Raspberry Pi. But I also built a custom-made Arduino shield with an Indy RS500 chip (from Impinj), which sends the RFID value over the USB port. This made life much easier and gave me the idea for Project X-Ray.
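For illustration, a minimal host-side sketch of picking up those values, assuming the shield prints one tag ID per line over serial (the port name and baud rate are placeholders):

```python
import serial  # pyserial

# Read tag IDs the Arduino shield writes to the USB serial port, one per line.
# "/dev/ttyUSB0" and 9600 baud are assumptions; adjust for your setup.
with serial.Serial("/dev/ttyUSB0", 9600, timeout=1) as port:
    while True:
        line = port.readline().decode("ascii", errors="ignore").strip()
        if line:
            print("Read tag:", line)  # here you would resolve the SKU and call YaaS
```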