Hybris Labs in 2016 – The Year of IoT

Parallel to the SAP Hybris Digital Summit 2017, we want to take a look at the past year through the eyes of Hybris Labs. We decided to name our 2016 “The Year of IoT”. The very first Labs IoT prototype of course dates back to 2014 and came in the shape of the original Smart Wine Shelf. But what we like to refer to as the “mastery of IoT” involves achievements such as the replication and adaptation of our prototypes, enabled through YaaS, the distribution of our demos to events across the globe, and the very first Hybris Labs prototype with an SAP Hybris customer.

In-Store Targeting and Analytics – on YaaS!

It’s finally time to write about a new project we’re working on. Hopefully this also helps to clear up a few open questions. So here’s some news about a project we’ll probably name “bullseye”. To some degree it is an extension of the wine shelf, but it’s super flexible in terms of configuration and products. And – boom – it’s almost 100% based on YaaS, the new hybris commerce APIs.

Architecture, rough… 

This architecture is rough and can change at any moment, but it’s a good basis for describing what this is about. The idea – again – is about selecting products in the physical retail space, and also about providing feedback to the retailer about physical interactions with products. YaaS plays a big role, as we use the YaaS Builder Module system to edit all the configuration of the system. We’ve also written our own YaaS service that provides the product matching logic in a completely tenant-aware fashion.

[Diagram: Bullseye technical architecture]

Platforms and Bases = Smart Shelf

From a technical perspective, the hardware used is less impressive. It’s really not the focus this time. We’ve worked on a 3D-printable design that contains the electronics for the hardware parts of this prototype. Each of the platforms below (so far we have about 20 fully working platforms) contains a microcontroller for the logic, a large 24-pixel NeoPixel LED ring (output) and an LDR (light-dependent resistor, input). The platforms connect via Micro-USB to a base (power, serial data), which most likely will be a Raspberry Pi again. In between, we need a standard USB 2.0 hub, as a Raspberry Pi has only 4 USB ports and we would like to power as many as 20 or 30 platforms from one base. Check out some images below.

[Photos of the 3D-printed platforms and bases]

The firmware that runs on the platforms is able to receive a few commands over a custom serial protocol. Via this protocol, we can change the identity of a platform (stored in EEPROM), read the sensor value or issue a light effect command (e.g. turn all pixels on, turn them red). It’s a fairly low-level, basic communication protocol. The only business-level logic that so far still runs on the microcontrollers is the calculation of lift-up times: we count the duration between the increase of light (product lifted) and the decrease of light (product down). To avoid interference from the NeoPixel ring, we block the event calculation while a light effect is running.
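The lift-up detection described above can be sketched as follows. This is an illustrative Python rendition, not the actual firmware (which runs in C on the microcontroller); the threshold value and tick-based timing are assumptions.

```python
LIGHT_THRESHOLD = 600  # hypothetical ADC value separating "covered" and "lit"

def liftup_durations(readings, effect_active=None):
    """Return lift-up durations (in ticks) from a sequence of LDR readings.

    Lifting a product exposes the LDR to light (reading rises above the
    threshold); putting it back covers the sensor again. While a light
    effect is running (effect_active[i] is True) events are ignored,
    mirroring how the firmware blocks event calculation during effects.
    """
    durations = []
    lifted_at = None
    for tick, value in enumerate(readings):
        if effect_active and effect_active[tick]:
            continue  # NeoPixel effect running: skip to avoid false events
        if value > LIGHT_THRESHOLD and lifted_at is None:
            lifted_at = tick  # product lifted: light reaches the LDR
        elif value <= LIGHT_THRESHOLD and lifted_at is not None:
            durations.append(tick - lifted_at)  # product put back down
            lifted_at = None
    return durations
```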

The bases, most likely Raspberry Pis, each have a unique ID. The platforms, again, have unique IDs. Via MQTT (using a node.js MQTT client), we can issue commands to the bases and to the platforms directly.

MQTT Broker

An important architecture component that we can’t live without is the MQTT broker. Due to port restrictions and other technical issues, this part currently lives outside of the YaaS cloud. For now, the bases connect to the broker and bridge the platforms over serial. The bases subscribe to MQTT topics that match the platform IDs. They also subscribe to a base-level topic, so we can send base-wide commands. If a platform disconnects from a base, we unsubscribe from the MQTT topic of that platform. This keeps the required communication bandwidth low.
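The subscription bookkeeping above could look roughly like this. The actual bases use a node.js MQTT client; this Python sketch uses a stub in place of a real client, and topic names like `bullseye/base/<id>` are assumptions, not the actual scheme.

```python
class BaseSubscriptions:
    """Tracks which MQTT topics a base listens on for its platforms."""

    def __init__(self, base_id, mqtt_client):
        self.base_id = base_id
        self.client = mqtt_client
        # every base always listens on its own base-wide topic
        self.client.subscribe(f"bullseye/base/{base_id}")

    def platform_connected(self, platform_id):
        # a platform was plugged in over serial: listen for its commands
        self.client.subscribe(f"bullseye/platform/{platform_id}")

    def platform_disconnected(self, platform_id):
        # unsubscribe so the broker stops forwarding traffic for it
        self.client.unsubscribe(f"bullseye/platform/{platform_id}")

class RecordingClient:
    """Stand-in for an MQTT client that just records subscriptions."""
    def __init__(self):
        self.topics = set()
    def subscribe(self, topic):
        self.topics.add(topic)
    def unsubscribe(self, topic):
        self.topics.discard(topic)
```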

YaaS Builder Module

The builder module that you get once you subscribe to our package in the YaaS Marketplace allows you to configure the physical mapping and the questionnaire that the end user finally gets to see. The products derive from those you’ve configured via the YaaS product service. Below are a few honest screenshots, taken before we even started styling these screens (be kind!).

As a user, you’ll first have to choose a shelf, which is identified by the ID of its base. Next, you choose which product category you’re creating the recommendation system for. All products of the shelf need to adhere to a common set of attributes, hence the category. Third, you’ll assign the products of that shelf/category combination to platform IDs. Finally, you specify the scoring configuration: which questions, which answers, and which score each answer contributes. The scoring configuration is the key ingredient for the end-user questionnaire form. Once all four steps are completed, the retailer is given an end-user URL that can be turned into a shelf-specific QR code (or put onto an NFC tag, placed on a physical beacon, shortened and printed, etc.).
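The four steps could map onto a configuration record like the following. The field names and shape are purely illustrative; the actual Builder Module data model is not shown in this post.

```python
# Hypothetical shape of one shelf's configuration after all four steps.
shelf_config = {
    "base_id": "base-01",          # step 1: which shelf (identified by base)
    "category": "wine",            # step 2: product category
    "mapping": {                   # step 3: product -> platform ID
        "sku-riesling": 1,
        "sku-merlot": 2,
    },
    "scoring": [                   # step 4: questions, answers, scores
        {
            "question": "Red or white?",
            "answers": {
                "red": {"sku-merlot": 10},       # points per product
                "white": {"sku-riesling": 10},
            },
        },
    ],
}
```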

[Screenshots of the four Builder Module configuration steps]

YaaS Matching Service

Our matching service is triggered by a special URL that goes through the YaaS API proxy, so all requests and bandwidth are counted and can later be billed. The end-user experience begins with a rendering of the questionnaire. The user chooses their answers and sends the data off to the matching service. The matching service then pulls the scoring configuration, the products and the mapping to calculate the matches. Based on a relative threshold, we calculate which products – and therefore which physical platforms – are highlighted. Finally, MQTT messages are sent out to the bases/platforms to highlight the appropriate platforms.
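The core of the matching step might look something like this sketch. It assumes each chosen answer adds a per-product score and that a product is highlighted when its total reaches a fraction (the “relative threshold”) of the best score; the real service runs on YaaS and is tenant-aware, so this is just the calculation, not the service.

```python
def match_products(answers, scoring, threshold=0.8):
    """answers: list of chosen answer keys, one per question in `scoring`.

    Returns the products to highlight, sorted for stable output.
    """
    totals = {}
    for question, answer in zip(scoring, answers):
        # each answer contributes points to one or more products
        for product, points in question["answers"].get(answer, {}).items():
            totals[product] = totals.get(product, 0) + points
    if not totals:
        return []
    best = max(totals.values())
    # highlight every product scoring at least `threshold` of the best match
    return sorted(p for p, s in totals.items() if s >= threshold * best)
```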

[Screenshots of the end-user questionnaire]

Once a customer uses the system via the questionnaire, the shelf belongs to her for the next few moments. This means we block access to the tenant/shelf combination for some time. During that time, the user interacts with the shelf in a personalized session. Lifting up a product results in the display of detailed information directly on the customer’s tablet or smartphone. And of course, it fuels a few analytics displays that still need to be detailed.
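The post doesn’t say how the tenant/shelf combination is blocked; a timed lock is one simple possibility, sketched here purely as an assumption.

```python
import time

class ShelfLock:
    """Grants one customer exclusive use of a shelf for a fixed duration."""

    def __init__(self, duration=60.0):
        self.duration = duration
        self.sessions = {}  # (tenant, shelf) -> expiry timestamp

    def acquire(self, tenant, shelf, now=None):
        now = time.time() if now is None else now
        key = (tenant, shelf)
        if self.sessions.get(key, 0) > now:
            return False  # another customer still owns this shelf
        self.sessions[key] = now + self.duration
        return True
```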

What’s next? Tons of work ahead.

We’re working hard on the specs for the initial version of this prototype and some sample products, categories and configuration that we’ll use for the hybris customer and partner days in Munich (early February 2016). But we’re also thinking of a few extra features that might make it into the prototype by then. For example, we’re thinking of a stocking mode, in which the platforms light up one after another and the screen shows you the product that needs to be on each one. It would help both a Labs member setting up a demo and a retail employee stocking a shelf. And we’re thinking of sending the recommended products via email, so a customer could continue shopping at home with a pre-filled cart.

Got ideas? Let us know. This is the time to provide input!

IoT Workshop with Particle Photon

Just a quick update from an internal IoT event here at hybris. Today, we ran our IoT Workshop for the first time and had a hell of a lot of fun. We used our carefully crafted IoT Experimentation Kits, which included a Particle Photon, to educate 15 people about the Internet of Things. And yes, we’ve connected our first buttons to the YaaS PubSub service, which was a lot of fun!

Below, you see the box that every participant received, plus a few other pics documenting the fun. At the beginning, we had quite some trouble getting some devices online. We bricked three devices, but in the end everybody was happy and got some YaaS buttons/LEDs connected.

[Photo of the IoT Experimentation Kit box]

We’re now looking forward to seeing some cool creations over the next couple of weeks! Here are a few more impressions:

[More photos from the workshop]

IoT with Arduino Yun and YaaS

The Arduino Yun hooked up to an LDR light sensor

Inspired by Georg’s post about connecting the ESP8266 to the upcoming hybris as a Service (YaaS), I thought it would be great to connect an Arduino microcontroller to the YaaS platform to showcase an Internet of Things (IoT) scenario. To keep this proof of concept (PoC) small, I decided that fetching an OAuth2 token and posting a sensor value to the YaaS Document Repository would be a good start.

The Arduino, however, does not have any out-of-the-box capability to send data to the cloud. There are many modules (or shields) that can connect the Arduino to the Internet, including Ethernet, Wi-Fi or Bluetooth Low Energy (BLE) through a gateway. Due to the Arduino’s computing and memory constraints, almost none of those connection options can leverage security layers such as HTTPS or TLS. The hybris YaaS platform, though, requires data to be sent using HTTPS, which does not leave a lot of options for secure IoT with the Arduino.

While looking into different options I saw that we had an Arduino Yun lying around in the office and so I decided to use it for the PoC. The Arduino Yun is a hybrid microcontroller board which includes a full-blown Linux System-on-a-Chip (SoC) as well as the same AVR chip found on an Arduino Leonardo. The Arduino IDE includes a Bridge library which lets the Arduino microcontroller talk to the Linux SoC over a USART serial connection.

Instead of implementing an Ethernet and/or Wi-Fi driver plus TCP/IP and HTTP for the limited AVR microcontroller, the Arduino team created a lightweight wrapper around the curl command-line HTTP client, which is called over the serial bridge on the Linux SoC.

Unfortunately, this library currently implements neither HTTPS, the secure variant of HTTP, nor POST requests. In order to make web service calls to YaaS, I had to implement my own little wrapper around curl for fetching YaaS tokens and for sending secure POST requests to the YaaS Document Repository.
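The wrapper essentially builds curl command lines and runs them on the Linux SoC. This sketch shows what those two command lines might look like; the endpoint URLs and parameter names are assumptions based on a typical OAuth2 client-credentials flow, not the exact YaaS paths.

```python
def token_request_cmd(client_id, client_secret):
    """curl argv for fetching an OAuth2 token (client credentials grant)."""
    return ["curl", "-s", "-X", "POST",
            "https://api.example.com/oauth2/v1/token",  # hypothetical URL
            "-d", "grant_type=client_credentials",
            "-d", f"client_id={client_id}",
            "-d", f"client_secret={client_secret}"]

def document_post_cmd(token, tenant, doc_type, json_body):
    """curl argv for posting a sensor value to a document store over HTTPS."""
    return ["curl", "-s", "-X", "POST",
            f"https://api.example.com/document/v1/{tenant}/{doc_type}",
            "-H", f"Authorization: Bearer {token}",
            "-H", "Content-Type: application/json",
            "-d", json_body]
```

On the Arduino side, the same strings are assembled with the String library and handed to the Bridge, which executes curl on the Linux SoC.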

The sensor that I am using is an LDR (light-dependent resistor), which is connected to the Arduino’s analog-to-digital converter (ADC) port A0 using a voltage divider circuit. The circuit was set up on a breadboard and connected to the Arduino Yun.
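For reference, the divider output follows Vout = Vcc · R_fixed / (R_ldr + R_fixed): as more light hits the LDR its resistance drops, so Vout and the 10-bit ADC reading (0–1023 on a 5 V Arduino) rise. The resistor values below are illustrative, not the ones used in the PoC.

```python
def divider_adc(r_ldr, r_fixed=10_000, vcc=5.0):
    """Expected 10-bit ADC reading for a given LDR resistance (ohms)."""
    v_out = vcc * r_fixed / (r_ldr + r_fixed)  # voltage divider formula
    return round(v_out / vcc * 1023)           # scale to the ADC range
```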


YaaS client library in the Arduino IDE

I am using the Arduino String library to construct the JSON strings for the HTTPS request bodies. The Arduino Yun YaaS library I created only implements the following features: requestToken(), securePostRequest() and a super simple jsonStringLookup() to parse the token from the JSON response. Packing this functionality into the 32 KB of program memory and 2 KB of RAM of an Arduino was a real challenge. When you run out of RAM on the microcontroller, things just stop working without any warning, and the Arduino IDE only offers serial messages for debugging.
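To illustrate what a “super simple” lookup like jsonStringLookup() can be, here is a Python rendition of the same idea (the original is C++ on the Arduino): find a key in a flat JSON string and return its string value, with no real parser. That is fine for pulling the access token out of a token response, but it would break on nested or escaped JSON.

```python
def json_string_lookup(json_text, key):
    """Return the string value for `key` in a flat JSON object, else None."""
    needle = f'"{key}"'
    at = json_text.find(needle)
    if at == -1:
        return None
    colon = json_text.find(':', at + len(needle))
    start = json_text.find('"', colon + 1)   # opening quote of the value
    end = json_text.find('"', start + 1)     # closing quote of the value
    return json_text[start + 1:end]
```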


Successfully requested OAuth2 token and uploaded a sensor value to the YaaS document repository

In the process of writing the YaaS HTTPS library, I realized that wrapping curl does not provide a great level of flexibility when it comes to error handling and retrying requests. There are also some drawbacks to having a Linux SoC on the same microcontroller board: battery operation is difficult due to its power requirements, and a full-blown Linux system needs to be maintained and kept secure over time.

With my Arduino Yun PoC I have proven that it is possible to connect an Arduino to our new microservices-based cloud platform YaaS. I learned that doing HTTPS requests from an 8-bit microcontroller is only feasible if you use a more powerful gateway, such as the Linux SoC included in the Arduino Yun. As a next step, it would be great to be able to visualize the sensor data I am pushing to the document repository. That’s a story for another blog post, though.

Max, what are you soldering there?

“Ouch! (his fingers, obviously… hihihi…) I’m just building a prototype board to connect the TV screens in the labs space to a logical control system, which we can then use for IoT purposes. And to connect this we need a special adapter; that’s what I’m soldering.”

[Photo of Max soldering the adapter board]

“How does that work? Not the soldering, the system.”

“It’s basically just a serial adapter which uses an Ethernet connector instead of a normal RS232 connector. And because we need something like this for a Raspberry Pi and we don’t want to stack three adapters in a row, we’re building it on a circuit board ourselves.”

“Tell us a bit more about why you’re doing this.”

“Our existing video solution in the labs space isn’t really satisfying in terms of the functionality it provides. We want a video system that lets us control at all times which video is playing, so that we can use it to display events generated by other IoT prototypes, e.g. Funky Retail. If a customer’s presence is detected, Funky Retail would normally just light up and a video would play. And this is all static, right? Meaning, the Raspberry Pi that is built into the Funky Retail system would then play a video. What we could do instead is use this ‘customer presence’ event to trigger some action on any of the screens in the labs space. It doesn’t need to be a simple video, it could be anything. Things like: you walk over from one screen to the next and the video follows you, with the right seek position in the video itself.”

“How exactly did you solve the problem you had connecting to the TVs?”

“That’s what I’m doing right now. The thing is, each TV needs two connections. One is HDMI, which is no problem, and the other is a serial connection which we can use to control the screens we already have in the labs space. We could use some other technology to control the TVs, but the ones we have don’t support this technology. It’s called HDMI CEC (Consumer Electronics Control) and it only exists in consumer TV screens, but we have professional screens. That’s why we need to use the professional controlling option which is a serial connection. So we need two cables to each screen. And what I’m soldering right now is the adapter for a Raspberry Pi to control each screen. And this is done via an Ethernet cable. What happens is, each Raspberry Pi will have an HDMI connection to a screen and a serial connection. Then we’ll have one controlling Raspberry Pi for each screen.”

“And at the end we can control the screens from wherever we want through the internet?”

“Yes, that’s the point. The Raspberry Pis will then connect to a central broker system, which is basically just a server on AWS. … Damn, too short! (I think I might be distracting Max here a bit…)
So, I’m just solving the hardware problems right now. The stories or the applications we can build on top of this are much more interesting than the hardware. This will be included in other prototypes as an output source. We have a lot of input sources in our IoT prototypes, e.g. pick-up events, presence events, etc., but most of our output events are flashing lights and web UIs. If what I’m building here works, we could utilise any TV screen, totally independent from our other prototypes, as an output. That’s the idea, and we’ll have the applications pretty soon. The easiest one is just to replace the static environment for Funky Retail, which already plays a video when you pick something up, and decouple the playback of the video from the controlling Raspberry Pi of Funky Retail. This totally makes sense.”

Thanks Max, hopefully we can celebrate the success soon! And be careful with that soldering iron…