hybris Americas Customer Days, part 1 – Bringing back the Funk

There was so much going on at this year’s hybris Americas Customer Days 2015 that we’ll make a trilogy out of the story. In part 1, ‘Bringing back the Funk’, we’ll talk about setting up our demos. Not a piece of cake this time…

Upon arrival in Fort Worth, Texas everything looked fine. Our booth was being set up according to our wishes. “This goes there please”, “power here please”, “a hole in that wall please”, “Ethernet here, here and here please”, “WiFi please”, “more WiFi please”, “more faster WiFi please”,… the usual stuff. Totally new to us was the needlessness of wine bottle re-labelling. Very confusing, because that usually was the first action item after checking into a hotel in the US. The schedule looked comfortable, since we had arrived on Sunday afternoon and the event didn’t start until Monday evening.

Monday morning: firing up and testing the demos. Moto, The Changing Room and Funky Retail were the prototypes on the list. It was the latter that caused us serious headaches this time. The photos tell the story…

[Photos: re-soldering the Funky Retail hardware]

Yes, we had to re-solder everything from top to bottom, only to find out that the short circuit that had caused the damage had also fried the Arduino. Unlike pressure and proximity sensors, this isn’t a spare part we carry around with us. Ironic, isn’t it? We’re always talking about simplifying e-commerce, but in this case the only option we had was to get into a taxi and drive an hour to the nearest suitable shop. Unfortunately we only found an inferior replacement. In the end, Georg and Lars managed to get three of the five Funky Boxes up and running. Average time to set up Funky Retail: 30 min; set-up time in Fort Worth: 27 h…

Oh, we also showed The Changing Room and Moto. They worked, though. Eventually… To be absolutely honest, there was a period of time during which none of the labs’ prototypes was working. Let’s call this ‘exciting’…


Moto – “It’s so simple, even Nick can run it…”

Thanks Lars… Have you finished playing with Minecraft?…

So, what is Moto? Moto is the fourth prototype of our IoT series and again we’re shifting the focus of what we want to demonstrate. When we built the Smart Wine Shelf, we concentrated on the customer experience. With Funky Retail we focused more on the analytics. Tiles was a step closer towards exploring the technological aspects around mobility. And now finally, with Moto, we’re diving even further into the IoT technology and the possibilities it offers to permanently reconfigure the prototype’s functions.


The ‘active’ physical components of a Moto are a distance sensor, a turning platform and an LED ring. Moto connects to an Android app via BLE and uses the device as a data hub. The components and their actions can be connected in any way that seems sensible, by means of a programming tool called ‘Node-RED’. And exactly this is the essence of Moto. ‘Node-RED’ allows users without special expertise in coding and programming (like Nick…) to configure an IoT-based system. The actual Moto and the actions taking place merely serve as an example. These actions are displayed in a web UI, through which they can also be triggered.

We’re deliberately not telling a specific business story around this prototype. Basically it’s a bit of plug & play for IoT.


Read the more techy posts on Moto here!

Making IoT visual. Using Node-RED for hybrislabs moto.

Our prototype ‘moto’ is just taking an interesting turn. It’s technically pretty robust, we’re finishing off some UI/product choice things and we asked ourselves: that’s it? While we’ve developed one story for each prototype so far, with moto it seems we’re focusing on different, higher-level issues of IoT: how things are wired up, how they can interact and how existing configurations can quickly be changed.

So instead of one story, we’ll have many stories for moto. It will stay interesting, from a technical perspective, too. But the real story is: using Node-RED extensions, we’re able to rewire the logic of moto very quickly. And: it’s a tool for business users. No hardwired setup.


Just like most of our other IoT-focused prototypes (Wine Shelf, Funky Retail, Tiles), Moto has a REST-based web API to control the light and motion, as well as a webhooks-based system to communicate with the outside world. But because moto is also built around MQTT (just like Funky Retail), it is easy to extend. With Node-RED and MQTT, we have a direct hook into the core messaging system of our IoT prototype. And we’re using the extensions we’re currently working on to make the commands and events easy to wire up. Take a look at a very simple example:

[Screenshot: a simple Node-RED flow]

 

Here, a node triggers every 5 seconds. It first hits the ‘red, slow, counter-clockwise’ node and the connected moto (here #2) begins to glow red and slowly rotate counter-clockwise. After a delay of 2 seconds, it turns green and rotates clockwise, fast. Finally, after another 1-second delay, the motor is switched off and the LEDs turn white. Then it starts over again.
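For readers without the screenshot at hand, here is a minimal sketch of the same sequence driven directly over MQTT with the Paho Java client. The broker address, the topic layout beyond moto/&lt;namespace&gt;/&lt;id&gt;/command and the JSON payload shape are assumptions for illustration, not the actual protocol.

```java
import org.eclipse.paho.client.mqttv3.MqttClient;
import org.eclipse.paho.client.mqttv3.MqttMessage;
import org.eclipse.paho.client.mqttv3.persist.MemoryPersistence;

public class MotoFlowSketch {
    public static void main(String[] args) throws Exception {
        // Broker address is an assumption; the real broker runs on Amazon EC2.
        MqttClient client = new MqttClient("tcp://broker.example.com:1883",
                MqttClient.generateClientId(), new MemoryPersistence());
        client.connect();

        // Namespace "default" and moto #2, matching the flow described above.
        String topic = "moto/default/2/command";

        while (true) {
            // red, slow, counter-clockwise
            client.publish(topic, new MqttMessage("{\"rgb\":[255,0,0],\"motor\":\"ccw-slow\"}".getBytes()));
            Thread.sleep(2000);
            // green, fast, clockwise
            client.publish(topic, new MqttMessage("{\"rgb\":[0,255,0],\"motor\":\"cw-fast\"}".getBytes()));
            Thread.sleep(1000);
            // motor off, LEDs white
            client.publish(topic, new MqttMessage("{\"rgb\":[255,255,255],\"motor\":\"off\"}".getBytes()));
            // wait out the rest of the 5-second cycle, then start over
            Thread.sleep(2000);
        }
    }
}
```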

We’re just seeing the first successes with Node-RED, but it looks very promising as the future brain of our prototypes. It might be a smart move, as others (e.g. non-technical people, the business guys, the guys with the smart stories) can rewire moto in whatever flavor they want.

Next up in tech are the input nodes, e.g. moto can send presence events that can start interactions.

Just to wrap up, here’s another example where we took the current temperature in Munich, converted that into a moto command and sent it to moto to display the temperature via light:

[Screenshot: Node-RED flow converting the Munich temperature into a moto command]
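A hedged sketch of what such a conversion could look like on the gateway or server side; the temperature thresholds, the colors and the payload shape are assumptions for illustration only:

```java
public class TemperatureToMoto {

    // Map a temperature reading to an RGB command payload (thresholds are assumptions).
    static String temperatureToCommand(double celsius) {
        int r, g, b;
        if (celsius < 10) {          // cold -> blue
            r = 0;   g = 0;   b = 255;
        } else if (celsius < 25) {   // mild -> green
            r = 0;   g = 255; b = 0;
        } else {                     // warm -> red
            r = 255; g = 0;   b = 0;
        }
        return String.format("{\"rgb\":[%d,%d,%d],\"motor\":\"off\"}", r, g, b);
    }
}
```

The resulting payload would then be published on the same moto/&lt;namespace&gt;/&lt;id&gt;/command topic as in the example above.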

Let us know what you think!

Moto Arch: pretty final

We’re really in the final phase of finishing moto – at least from a technical side. Our friends at SNK (Kathi) have produced an awesome architecture poster, which I mainly wanted to share with this post:

[Image: moto architecture poster]

To recap: the main purpose of #hybrislabs moto is to figure out how BLE devices in the retail space can be connected to the internet via the employees in that space themselves – via an app on a smartphone. This is an interesting topology that we have not touched yet. So there is no hub; the smartphone apps of the employees take over that part. The things get connected based on reachability from an employee’s phone. While we have done lots of things (including NFC etc.) on mobile devices already, we had never implemented a true BLE/MQTT gateway as a smartphone app. I am really happy we’ve fixed that.

By the way – have you discovered Bluz on Kickstarter? Same idea. The bluz hardware is connected to the Spark Cloud (which gives you RESTful APIs, webhooks, data logging, etc.) via a gateway app that lives on a smartphone. It’s really the same idea; our implementation is a bit narrower and probably not as generic as theirs.

Some more news:

  • Namespaces: the app now operates under a namespace. So all employees of Store X can use the namespace X. That means all connected things will report/forward events/commands under the appropriate namespace to the server. The server now also has namespaced UIs, e.g. a /groups/default path will present the connected motos for that namespace.
  • Webhooks: for each namespace, webhooks can be set up. A webhook is a callback mechanism for all third-party or harder-to-integrate systems that can only speak HTTP. Instead of accessing our MQTT broker directly (which requires port and protocol access, tricky especially in enterprise environments), a webhook can deliver the events from connected things via an HTTP POST request with JSON data as the payload. Works amazingly well and solves most of your integration problems.
  • REST API: while webhooks solve the problem of reporting events to legacy or third-party HTTP-based systems, the REST API allows other systems to access the things, e.g. allows them to send commands down to the device. We’ve created a simple REST API that forwards the requests to the MQTT broker, which then talks to all subscribers (e.g. the smartphone apps that finally relay all that to the things via BLE). See the sketch after this list.
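To make both mechanisms a bit more concrete, here is a minimal, hedged sketch of a third-party system pushing a command through the REST API. The endpoint path, the JSON payload shapes and the webhook example in the comment are assumptions for illustration, not the documented API.

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class MotoRestClientSketch {

    // Example webhook payload a subscriber might receive (shape is an assumption):
    // POST /my-endpoint  {"namespace":"default","moto":2,"event":"presence","duration":4200}

    public static void main(String[] args) throws Exception {
        // Assumed endpoint: the REST API forwards this to the MQTT broker,
        // which relays it to the gateway apps and finally to the moto via BLE.
        URL url = new URL("https://moto.example.com/groups/default/motos/2/command");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "application/json");
        conn.setDoOutput(true);

        String body = "{\"rgb\":[0,0,255],\"motor\":\"cw-slow\"}"; // payload shape is an assumption
        try (OutputStream os = conn.getOutputStream()) {
            os.write(body.getBytes(StandardCharsets.UTF_8));
        }
        System.out.println("Response code: " + conn.getResponseCode());
    }
}
```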

While I am visiting ThingsCon in Berlin, I hope we’ll have some progress on the web UIs and the hardware side to share soon. I expect new hardware next week, as well as an updated web UI. A few more iterations with our awesome friends at SNK and DerGrueneFisch, and we should be ready to show the final version. Enjoy.

Moto Update: the smartphone is now our MQTT/BLE Gateway

It’s time for an update about ‘moto’ – sorry that this did not happen earlier but I’ve been busy with events like #cebit, #iotcon or #internetworld. We’ve now finalized the hardware design and our good friends at DerGrueneFisch are manufacturing a small series of moto prototypes (9 to be exact) in the coming weeks. This also means that I’ve moved on to more software-related challenges instead of hardware challenges.

[Moto architecture diagram]

If you remember the architecture diagram (find it again above), we connect the motos wirelessly via BLE. While I had been using some quick & dirty node.js-based scripts on my Mac for testing the communication over BLE, I’ve now written an Android app that acts as an MQTT/BLE gateway. Powered up, it launches two services: the BLEService and the MQTTService. These services are started and then continue to run in the background, loosely coupled via Android Intents. Right now, we fire up the services when the Android Activity (that’s what is shown on the screen when an Android app starts) becomes visible, and we stop them again once the app becomes invisible. This is really convenient, as we tear down and fire up the services a lot during testing.
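A minimal sketch of that lifecycle wiring, assuming service classes named BLEService and MQTTService (the class names match the sections below; everything else here is a simplification):

```java
import android.app.Activity;
import android.content.Intent;

public class MotoActivity extends Activity {

    @Override
    protected void onStart() {
        super.onStart();
        // Fire up both background services when the Activity becomes visible.
        startService(new Intent(this, MQTTService.class));
        startService(new Intent(this, BLEService.class));
    }

    @Override
    protected void onStop() {
        super.onStop();
        // Tear the services down again once the app becomes invisible.
        stopService(new Intent(this, BLEService.class));
        stopService(new Intent(this, MQTTService.class));
    }
}
```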

BLEService
This sticky service (meaning the system may restart it if it was removed due to resource constraints) scans for new, not-yet-connected motos and then tries to connect. Once connected, we subscribe to notifications for one BLE characteristic which acts as the event stream from the hardware. We also save a reference to another identified characteristic that we use to send our commands to. In order to react to commands and be able to forward events, we use Android Intents. The BLEService registers listeners for all intents that the MQTTService is sending out, as they contain the moto commands that need to be forwarded to the motos. The BLEService also maps the incoming commands to the corresponding motos and – new – is now namespaced. That means the users of the Moto Android app will later be able to choose their namespace, so the analytics data is kept separate from others.
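A heavily abridged sketch of what such a sticky service could look like. The UUIDs, the intent action and all error handling are assumptions or omissions; the scanning and connection code is left out entirely:

```java
import android.app.Service;
import android.bluetooth.BluetoothGatt;
import android.bluetooth.BluetoothGattCallback;
import android.bluetooth.BluetoothGattCharacteristic;
import android.bluetooth.BluetoothGattService;
import android.content.Intent;
import android.os.IBinder;

import java.util.UUID;

public class BLEService extends Service {

    // Placeholder UUIDs for the event and command characteristics -- not the real ones.
    private static final UUID EVENT_CHAR_UUID   = UUID.fromString("0000aaaa-0000-1000-8000-00805f9b34fb");
    private static final UUID COMMAND_CHAR_UUID = UUID.fromString("0000bbbb-0000-1000-8000-00805f9b34fb");

    private BluetoothGattCharacteristic commandCharacteristic;

    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {
        // Sticky: the system may restart the service if it gets killed.
        return START_STICKY;
    }

    private final BluetoothGattCallback gattCallback = new BluetoothGattCallback() {
        @Override
        public void onServicesDiscovered(BluetoothGatt gatt, int status) {
            for (BluetoothGattService service : gatt.getServices()) {
                BluetoothGattCharacteristic events = service.getCharacteristic(EVENT_CHAR_UUID);
                if (events != null) {
                    // Subscribe to notifications: this is the event stream from the hardware.
                    // (A real app also needs to write the client configuration descriptor.)
                    gatt.setCharacteristicNotification(events, true);
                }
                BluetoothGattCharacteristic cmd = service.getCharacteristic(COMMAND_CHAR_UUID);
                if (cmd != null) {
                    // Keep a reference so commands from the MQTTService can be written later.
                    commandCharacteristic = cmd;
                }
            }
        }

        @Override
        public void onCharacteristicChanged(BluetoothGatt gatt, BluetoothGattCharacteristic ch) {
            // Forward the raw event bytes to the MQTTService via a broadcast intent.
            Intent event = new Intent("moto.EVENT"); // intent action is an assumption
            event.putExtra("payload", ch.getValue());
            sendBroadcast(event);
        }
    };

    @Override
    public IBinder onBind(Intent intent) {
        return null; // started service, not bound
    }
}
```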

MQTTService
For MQTT, we’re using the only Android/Java MQTT client we were able to find: Paho. Although there seems to be an existing Android Service wrapper around the Paho MQTT client, it is sparsely documented and it really was simpler to create our own service that does exactly what we want it to do. The MQTTService is again sticky and should be running all the time. It tries to keep a constant connection to the MQTT broker that we host on Amazon EC2. It is subscribed to all commands that fall into its namespace, e.g. moto/<namespace>/+/command – an MQTT topic with a + wildcard, meaning it will receive messages sent to moto/<namespace>/1/command, for example.
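A condensed sketch of such a service with the Paho client; the broker address, the intent action and the reconnect handling are assumptions or omissions:

```java
import android.app.Service;
import android.content.Intent;
import android.os.IBinder;

import org.eclipse.paho.client.mqttv3.IMqttDeliveryToken;
import org.eclipse.paho.client.mqttv3.MqttCallback;
import org.eclipse.paho.client.mqttv3.MqttClient;
import org.eclipse.paho.client.mqttv3.MqttException;
import org.eclipse.paho.client.mqttv3.MqttMessage;
import org.eclipse.paho.client.mqttv3.persist.MemoryPersistence;

public class MQTTService extends Service {

    private MqttClient client;
    private final String namespace = "default"; // later chosen by the user

    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {
        try {
            // Broker address is an assumption; the real broker runs on Amazon EC2.
            client = new MqttClient("tcp://broker.example.com:1883",
                    MqttClient.generateClientId(), new MemoryPersistence());
            client.setCallback(new MqttCallback() {
                @Override
                public void messageArrived(String topic, MqttMessage message) {
                    // Relay the command to the BLEService via a broadcast intent.
                    Intent command = new Intent("moto.COMMAND"); // intent action is an assumption
                    command.putExtra("topic", topic);
                    command.putExtra("payload", message.getPayload());
                    sendBroadcast(command);
                }

                @Override
                public void connectionLost(Throwable cause) { /* reconnect logic omitted */ }

                @Override
                public void deliveryComplete(IMqttDeliveryToken token) { }
            });
            client.connect();
            // Subscribe to all commands within our namespace, e.g. moto/default/1/command.
            client.subscribe("moto/" + namespace + "/+/command");
        } catch (MqttException e) {
            e.printStackTrace();
        }
        return START_STICKY;
    }

    @Override
    public IBinder onBind(Intent intent) {
        return null;
    }
}
```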

Getting MQTT or BLE running on Android alone, for a small demo, is pretty easy. The complexity comes once you try to connect to multiple devices at once, because the Android BLE APIs are synchronous and firing too many BLE requests at once will simply override (and thus lose) a few of the requests. So one has to work with a few delays and timers here and there to make sure it really works reliably. The idea is also that sales agents with the app installed can roam freely, and if one is close to the BLE devices, their phone/app will connect transparently. So far, this works really nicely. After a few seconds outside of the coverage area, the BLEService starts to receive disconnect callbacks and we remove the moto from the list of connected ones. This enables it to be picked up by another sales agent whose device has the app installed.
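One common way to deal with this is to serialize the writes with a small delay between them. The sketch below is an illustrative assumption (the 200 ms delay and the queue approach are not taken from the original code):

```java
import android.bluetooth.BluetoothGatt;
import android.bluetooth.BluetoothGattCharacteristic;
import android.os.Handler;
import android.os.Looper;

import java.util.LinkedList;
import java.util.Queue;

// Serialize BLE characteristic writes with a small delay between them, since
// Android's BLE stack loses requests that are fired too quickly in succession.
public class BleWriteQueue {

    private final Handler handler = new Handler(Looper.getMainLooper());
    private final Queue<byte[]> pending = new LinkedList<>();
    private final BluetoothGatt gatt;
    private final BluetoothGattCharacteristic commandCharacteristic;

    public BleWriteQueue(BluetoothGatt gatt, BluetoothGattCharacteristic characteristic) {
        this.gatt = gatt;
        this.commandCharacteristic = characteristic;
    }

    public synchronized void enqueue(byte[] command) {
        boolean wasIdle = pending.isEmpty();
        pending.add(command);
        if (wasIdle) {
            writeNext();
        }
    }

    private synchronized void writeNext() {
        byte[] next = pending.poll();
        if (next == null) return;
        commandCharacteristic.setValue(next);
        gatt.writeCharacteristic(commandCharacteristic);
        // Wait a bit before sending the next command so requests are not lost.
        handler.postDelayed(this::writeNext, 200);
    }
}
```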

The Protocol
At least for now, I’ve also frozen the “protocol”, e.g. which characteristics are used, what data is sent, and how it is determined what is possible and what is not. First of all, for sending and receiving data from/to the moto elements, I use two separate BLE characteristics. This simply keeps everything a bit more organized and easier to understand. For sending from the BLE hardware to the smartphone, struct-based events are used, defined directly in the Arduino sketch.

Mainly due to issues with setting up multiple BLE notifications from Android at once, I decided to distinguish the two events that I send out via the first byte – see the “eventType” byte, which is different for a PresenceData event and a MetaData event. MetaData events are sent out regularly to inform the smartphone, and later the server, that a device is live. We visualize the MetaData events again via heartbeats. You can tell within 10 seconds if a device is connected or not. The PresenceData events are sent whenever the presence state (customer in front/customer lost) changes. Just like with tiles, we also calculate the duration of the presence directly on the device.
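The original Arduino structs are not reproduced here. As a hedged sketch of the receiving end, this is roughly how the gateway could tell the two event types apart by their first byte; the marker values, field order and field sizes are all assumptions:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class MotoEventParser {

    // Assumed event type markers -- the real values live in the Arduino sketch.
    static final byte EVENT_META     = 0x01;
    static final byte EVENT_PRESENCE = 0x02;

    static void parse(byte[] payload) {
        ByteBuffer buf = ByteBuffer.wrap(payload).order(ByteOrder.LITTLE_ENDIAN);
        byte eventType = buf.get();
        if (eventType == EVENT_META) {
            // Heartbeat: the device reports that it is alive.
            System.out.println("MetaData event (heartbeat)");
        } else if (eventType == EVENT_PRESENCE) {
            // Presence changed: customer in front / customer lost, plus duration.
            byte present = buf.get();
            long durationMillis = buf.getInt() & 0xFFFFFFFFL;
            System.out.println("Presence=" + present + " duration=" + durationMillis + "ms");
        }
    }
}
```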

For incoming data, so-called moto commands, the protocol is slightly more complex. We distinguish between two broad categories of commands:

  • “standard” commands can change the current RGB colors and the motor state (this includes on/off, direction and speed level of the motor)
  • “special” commands are distinguished from normal commands by the value of the first byte. To be able to extend the command mechanism, they introduce a “subcommand” byte as the second byte. From the third byte on, the special command’s data is sent. Right now I’ve specified a “blink” command that will blink the RGB pixels for a certain duration in a certain color. Another command implemented is a rainbow chase, so the pixels will update according to a color wheel which looks like a rainbow in the end.

On the device, the firmware dispatches on the first byte of each incoming packet and, for special commands, on the subcommand byte that follows.
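As a hedged illustration of the command side, here is how the gateway might assemble such packets before writing them to the command characteristic. The marker value, byte order and duration encoding are assumptions based on the description above, not the real protocol:

```java
public class MotoCommandBuilder {

    // Assumed marker that distinguishes "special" from "standard" commands.
    static final byte SPECIAL_MARKER   = (byte) 0xFF;
    static final byte SUBCOMMAND_BLINK = 0x01;

    // Standard command: RGB color plus motor state (on/off, direction, speed level).
    static byte[] standard(int r, int g, int b, boolean motorOn, boolean clockwise, int speedLevel) {
        return new byte[] {
                (byte) r, (byte) g, (byte) b,
                (byte) (motorOn ? 1 : 0),
                (byte) (clockwise ? 1 : 0),
                (byte) speedLevel
        };
    }

    // Special command: marker byte, subcommand byte, then the subcommand's data.
    static byte[] blink(int r, int g, int b, int durationMillis) {
        return new byte[] {
                SPECIAL_MARKER, SUBCOMMAND_BLINK,
                (byte) r, (byte) g, (byte) b,
                (byte) (durationMillis / 100) // duration in 100 ms steps (assumption)
        };
    }
}
```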

Android UI Adapter
One last element that got a lot of love from me is a special UI Adapter for the Android app. There’s nothing super special about this data/UI binding, it is just a lot of work. The UI will later try to come close to the action that the moto element is performing: if it blinks, the UI element in the android app will blink, colors will be reflected as well as possible and of course the spinning status will be represented. Once I have a few motos connected at once, I will shoot a few pics and show this to you in an update.

Up next
Now that I have the hardware specced out and a running gateway prototype via the Android App, the next thing that I’ll spend time on is the server side that collects all the data. This will also include a RESTful API to control each moto element, client/server communication via socket.io for the UI and early ideas for the skinning. I hope to receive the first elements of the produced series within 2-3 weeks and will try to update you on the progress made.

 

Moto: exploring the smartphone as an IoT hub for retail

At #hybrislabs, we’ve explored IoT quite a bit now. We began with the smart wine shelf, our first IoT experience for the retail space, which used a unique idea and combination of technologies to provide both customer and retailer value. Next up was funky retail, where we focused on the analytics in the retail space with both distance and pressure sensors. With tiles, we went wireless for the first time – but still used a central hub where all Bluetooth LE messages were collected and forwarded to the cloud.

Finally, with moto, we’re now filling a gap. We would like to explore one missing IoT topology in our portfolio: using the smartphone as a hub for the connected devices around you. Below is a pic of how the current prototype looks. In the end, it will be a glass-protected, spinning disk that is lit up from below. It will feature an IR distance sensor to detect customers and be able to change rotation speed and direction as well as the color. It will require a power cable, but communication will again be Bluetooth Low Energy. Here’s also a video of moto from a recent G+ post.

[Photo: current moto prototype]

What is more important, and sadly almost invisible, is *how* we connect these IoT elements. We’ll not use a central hub. Instead, the plan is to have iOS/Android apps installed on the sales assistants’ phones that automatically connect to the retailer’s smart objects. These apps connect via BLE and forward the data between the cloud and the things. The idea is that a sales assistant can move freely in the retail space. The app will scan and connect, might lose the connection from time to time and leave one “moto” disconnected, then later move back in range and reconnect. If another sales assistant with the same app and configuration moves in range, he or she will take over. Here’s the architecture:

[Moto architecture diagram]

At this time, we’ve successfully connected to the motos and defined the rough BLE-based protocol that we’ll use. We’ve got some node.js-based code that works on a Mac for experimenting and testing. Next up will be the task of writing a good Android app (iOS welcome, too) that launches, finds IoT elements, connects and then proxies the communication to the cloud. For the cloud communication, we’ll again use MQTT, but we still need to find a good and easy MQTT solution for Android/iOS. So if you have any good ideas and can point us in the right direction, let us know! (@hansamann or comment – we actually do read them!)

To wrap this up, here’s the raw PCB of moto with the NeoPixel RGB ring and IR distance sensor connected. The board again uses a LightBlue Bean for the BLE connectivity. As it runs on 9V for the stepper motor (which is not shown here), we need to step the voltage down twice: once to 5V for the NeoPixel RGB LEDs, and once to 3.3V for the LightBlue Bean. We’re also using a stepper motor driver, a DRV8834 on a breakout board, that allows us to control the direction and speed of the stepper motor.

[Photo: raw moto PCB with NeoPixel ring and IR distance sensor]