Hybris Labs in 2016 – The Year of IoT

In parallel to the SAP Hybris Digital Summit 2017, we want to take a look at the past year through the eyes of Hybris Labs. We decided to name our 2016 “The Year of IoT”. The very first Labs IoT prototype does, of course, date back to 2014 and came in the shape of the original Smart Wine Shelf. But what we like to refer to as the “mastery of IoT” involves achievements such as the replication and adaptation of our prototypes, enabled through YaaS, the distribution of our demos to events across the globe, and the very first Hybris Labs prototype built with an SAP Hybris customer.

 

BASF & Hybris Wine Shelf

After we showed Bullseye at the SAP Hybris Global Summit in February 2016, it was “HEINEKEN” who first approached us with a customisation request for an internal event. The result was the Beer Selector, which we later also showed at the Hybris Americas Customer Days in Fort Lauderdale.

“BASF”, a world-leading chemical company but also one of Germany’s largest wine distributors, decided to take the Bullseye prototype one step further. The BASF & Hybris Wine Shelf is the first pilot project Labs engaged in together with a customer and is currently in use in the BASF “Weinkeller” in Ludwigshafen.

 

Bottomless shopping carts and robots on trucks

There was more to the past year than just Bullseye. The close collaboration with the SAP Hybris customer “hansgrohe” allowed us to frequently present our Infinite Cart prototype in an eye-catching setup and to produce a video that is just as eye-catching for its viewers.

Without living up to the clichés that are commonly associated with this term, Hybris Labs are often referred to as rock stars. Reluctantly accepting those honours, we did manage to get one of our team members on a tour bus – the “Beyond CRM Truck”. His name… her… its name: Pepper.

Pepper was the only one of us tough enough to deal with life on the road. As an extension to Bullseye, Pepper handed out candy to truck-visitors all across Europe.

 

The golden age of computed artificiality

In the past, our focus lay on the digitization of the physical retail space. In 2017, Hybris Labs will be exploring the potential of virtual reality, augmented reality, and artificial intelligence in the shape of voice-controlled digital assistants and conversational commerce. We’re hoping to present the first results of this research in spring. In other words: new Hybris Labs prototypes are coming soon!

“What the computer in virtual reality enables us to do is to recalibrate ourselves so that we can start seeing those pieces of information that are invisible to us but have become important for us to understand.” – Douglas Adams

Perhaps we’ll even find the question to the answer “42”…

In-Store Targeting and Analytics – on YaaS!

It’s finally time to write about a new project we’re working on. Hopefully this also helps to clear up a few open issues we’re still working on. So here’s some news about a project we’ll probably name “bullseye”. To some degree it is an extension of the wine shelf. But it’s super flexible in terms of configuration and products. And – boom – it’s almost 100% based on YaaS, the new hybris commerce APIs.

Architecture, rough… 

This architecture is rough and can change at any moment, but it’s a good basis to describe what this is about. The idea itself – again – is about selecting products in the physical retail space, and also about providing feedback to the retailer about physical interactions with products. YaaS plays a big role, as we use the YaaS Builder Module system to edit all the configuration of the system. We’ve also written our own YaaS service that provides the product matching logic in a completely tenant-aware fashion.

Bullseye Technical Architecture

Platforms and Bases = Smart Shelf

From a technical perspective, the hardware used is less impressive. It’s really not the focus this time. We’ve worked on a 3D-printable design that contains the electronics for the hardware parts of this prototype. Each of the platforms below (so far we have about 20 fully working platforms) contains a microcontroller for the logic, a large 24-pixel NeoPixel LED ring (output) and an LDR (light dependent resistor, input). The platforms connect via Micro-USB to a base (power, serial data), which most likely will be a Raspberry Pi again. In between, we need a standard USB 2.0 hub, as a Raspberry Pi has only 4 USB ports and we would like to power as many as 20 or 30 platforms from one base. Check out some images below.


The firmware that runs on the platforms is able to receive a few commands over a custom serial protocol. Via this protocol, we can change the identity of a platform (stored in EEPROM), read the sensor value or issue a light effect command (e.g. turn all pixels on, turn them red). It’s a fairly low-level, basic communication protocol. The only business-level logic that so far still runs on the microcontrollers is the calculation of lift-up times: we measure the duration between the increase of light (product lifted) and the decrease of light (product put back down). To not interfere with the NeoPixel (light) ring, we block the lift-up detection while a light effect is running.
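To make the serial protocol a bit more concrete, here is a minimal sketch of what such platform firmware could look like. The command characters, pin numbers and light thresholds are our own illustrative assumptions, not the actual protocol.

```cpp
// Hypothetical platform firmware sketch: tiny serial command protocol plus
// lift-up timing via the LDR. Pins, command codes and thresholds are made up.
#include <Adafruit_NeoPixel.h>
#include <EEPROM.h>

const int LDR_PIN  = A0;   // light dependent resistor (input)
const int RING_PIN = 6;    // 24-pixel NeoPixel ring (output)
Adafruit_NeoPixel ring(24, RING_PIN, NEO_GRB + NEO_KHZ800);

bool effectRunning = false;      // block lift-up detection while an effect runs
unsigned long liftStart = 0;

void setAll(uint32_t color) {
  for (int i = 0; i < 24; i++) ring.setPixelColor(i, color);
  ring.show();
}

void setup() {
  Serial.begin(115200);
  ring.begin();
  ring.show();
}

void loop() {
  // 'I' = report identity, 'S' = read sensor, 'R' = red effect, 'O' = off
  if (Serial.available()) {
    char cmd = Serial.read();
    if (cmd == 'I') Serial.println(EEPROM.read(0));       // identity stored in EEPROM
    if (cmd == 'S') Serial.println(analogRead(LDR_PIN));  // raw LDR value
    if (cmd == 'R') { setAll(ring.Color(255, 0, 0)); effectRunning = true; }
    if (cmd == 'O') { setAll(0); effectRunning = false; }
  }

  // lift-up timing: light rises when the product is lifted, falls when put back
  if (!effectRunning) {
    int light = analogRead(LDR_PIN);
    if (liftStart == 0 && light > 700) liftStart = millis();
    if (liftStart != 0 && light < 400) {
      Serial.print("LIFT ");
      Serial.println(millis() - liftStart);   // duration the product was lifted
      liftStart = 0;
    }
  }
}
```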

The bases, most likely Raspberry Pis, each have a unique ID. The platforms, again, have unique IDs. Via MQTT (using a node.js MQTT client) we can issue commands to the bases and to the platforms directly.

MQTT Broker

An important architecture component that we can’t live without is the MQTT broker. Due to port restrictions and other technical issues, this part is currently outside of the YaaS cloud. For now, the bases connect to the broker and bridge the platforms over serial. The bases subscribe to MQTT topics that match the platform IDs. They also subscribe to a base-level topic, so we can send base-wide commands. If a platform disconnects from a base, we unsubscribe from the MQTT topic of that platform. This keeps the required communication bandwidth low.

YaaS Builder Module

The builder module that you get once you subscribe to our package in the YaaS Marketplace allows you to configure the physical mapping and the questionnaire that the end-user finally gets to see. The products derive from the products you’ve configured via the YaaS product service. Below are a few honest screenshots, before we even started styling these screens (be kind!).

As a user, you’ll first have to choose a shelf, which is identified by the ID of the base. Next, you choose which product category you’re creating the recommendation system for. All products of the shelf need to adhere to a common set of attributes, hence the category. Third, you’ll assign the products of that shelf/category combination to platform IDs. Finally, the scoring configuration is specified: which questions, which answers, and which score per correct answer. The scoring configuration is the key ingredient of the end-user questionnaire form. Once all four steps are completed, the retailer is given an end-user URL that can be turned into a shelf-specific QR code (or put onto an NFC tag, put onto a physical beacon, shortened and printed, etc.).


YaaS Matching Service

Our matching service is triggered by a special URL that goes through the YaaS API proxy. All requests and bandwidth are counted and can later be billed. The end-user experience begins with a rendering of the questionnaire. The user chooses their answers and sends the data off to the matching service. The matching service then pulls the scoring configuration, the products and the mapping to calculate the matches. Based on the relative threshold, we calculate which products, and therefore which physical platforms, are highlighted. Finally, MQTT messages are sent out to the bases/platforms to highlight the appropriate platforms.
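To illustrate the matching step, here is a small, language-agnostic sketch (written in C++ for brevity) of how answers could be scored against products and compared to the relative threshold. The field names and the 0..1 threshold are assumptions; the real logic runs inside our tenant-aware YaaS service.

```cpp
#include <map>
#include <string>
#include <utility>
#include <vector>

struct Product {
  std::string id;
  int platformId;                                 // physical platform the product sits on
  std::map<std::string, std::string> attributes;  // e.g. {"taste": "dry"}
};

// answers: attribute -> value chosen in the questionnaire
// scores:  attribute -> points awarded when a product matches that answer
std::vector<int> matchingPlatforms(const std::vector<Product>& products,
                                   const std::map<std::string, std::string>& answers,
                                   const std::map<std::string, int>& scores,
                                   double relativeThreshold) {
  std::vector<std::pair<int, int>> scored;        // (platformId, score)
  int best = 0;
  for (const auto& p : products) {
    int s = 0;
    for (const auto& a : answers) {
      auto attr = p.attributes.find(a.first);
      if (attr != p.attributes.end() && attr->second == a.second && scores.count(a.first))
        s += scores.at(a.first);
    }
    scored.push_back({p.platformId, s});
    if (s > best) best = s;
  }
  std::vector<int> highlight;                     // platforms that receive the "light up" MQTT message
  for (const auto& e : scored)
    if (best > 0 && e.second >= relativeThreshold * best)
      highlight.push_back(e.first);
  return highlight;
}
```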


Once a customer uses the system via the questionnaire, the shelf belongs to her for the next few moments. This means we block access to that tenant/shelf combination for some time. During that time, the user interacts with the shelf in a personalized session. Lifting up a product results in the display of detailed product information directly on the customer’s tablet or smartphone. And of course, it fuels a few analytics displays that still need to be fleshed out.

What’s next? Tons of work ahead.

We’re working hard on the specs for the initial version of this prototype and on some sample products, categories and configuration that we’ll use for the hybris customer and partner days in Munich (early February 2016). But we’re also thinking of a few extra features that might make it into the prototype by then. For example, we’re thinking of a stocking mode, in which the platforms light up one after another while the screen shows the product that needs to be placed on them. It helps both a labs member setting up a demo and a retail employee stocking a shelf. And we’re thinking of sending the recommended products via email, so a customer could then continue shopping at home with a pre-filled cart.

Got ideas? Let us know. This is the time to provide input!

IoT Workshop with Particle Photon

Just a quick update from an internal IoT event here at hybris. Today, we ran our IoT Workshop for the first time and had a hell of a lot of fun. We used our carefully crafted IoT Experimentation Kits, which included a Particle Photon, to educate 15 people about the Internet of Things. And yes, we’ve connected our first buttons to the YaaS PubSub service, which was a lot of fun!

Below, you see the box that every participant received, plus a few other pics documenting the fun. At the beginning, we had quite some trouble getting some devices online. We bricked three devices, but in the end everybody was happy and got some YaaS buttons/LEDs connected.


We’re now looking forward to seeing some cool creations over the next couple of weeks! Here are a few more impressions:


IoT with Arduino Yun and YaaS

The Arduino Yun hooked up to an LDR light sensor

Inspired by Georg’s post about connecting the ESP8266 to the upcoming hybris as a Service (YaaS), I thought it would be great to connect an Arduino microcontroller to the YaaS platform to showcase an Internet of Things (IoT) scenario. To keep this proof of concept (PoC) small, I decided that fetching an OAuth2 token and posting a sensor value to the YaaS Document Repository would be a good start.

The Arduino, however, does not have any out-of-the-box capability to send data to the cloud. There are many modules (or shields) which are able to connect the Arduino to the Internet, including Ethernet, WiFi or Bluetooth Low Energy (BLE) through a gateway. Due to the Arduino’s computing and memory constraints, almost none of those connection options can leverage security layers such as HTTPS or TLS. The hybris YaaS platform, though, requires data to be sent over HTTPS, which does not leave a lot of options for secure IoT with the Arduino.

While looking into different options I saw that we had an Arduino Yun lying around in the office and so I decided to use it for the PoC. The Arduino Yun is a hybrid microcontroller board which includes a full-blown Linux System-on-a-Chip (SoC) as well as the same AVR chip found on an Arduino Leonardo. The Arduino IDE includes a Bridge library which lets the Arduino microcontroller talk to the Linux SoC over a USART serial connection.

Instead of implementing an Ethernet and/or WIFI driver plus TCP/IP and HTTP for the limited AVR microcontroller, the Arduino team created a lightweight wrapper for the CURL command line HTTP client which is called over a serial bridge on the Linux SoC.

Unfortunately, this library currently implements only plain HTTP, not its secure variant HTTPS, and it does not support sending POST requests. In order to make web service calls to YaaS, I had to implement my own little wrapper around CURL for fetching YaaS tokens and for sending secure POST requests to the YaaS Document Repository.
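As a rough idea of what such a wrapper looks like, here is a hedged sketch using the Yun’s Bridge/Process library to shell out to curl on the Linux side. The token endpoint, client credentials and form parameters are placeholders, not the real YaaS values.

```cpp
#include <Bridge.h>
#include <Process.h>

String requestToken() {
  Process p;                       // runs a command on the Linux SoC via the bridge
  p.begin("curl");
  p.addParameter("-s");
  p.addParameter("-X");
  p.addParameter("POST");
  p.addParameter("-d");
  p.addParameter("grant_type=client_credentials&client_id=MY_ID&client_secret=MY_SECRET");
  p.addParameter("https://example-yaas-endpoint/oauth2/token");  // placeholder URL
  p.run();                         // blocks until curl returns

  String response = "";
  while (p.available() > 0) response += (char)p.read();
  return response;                 // JSON body containing the access token
}

void setup() {
  Bridge.begin();                  // start the serial bridge to the Linux SoC
  Serial.begin(9600);
}

void loop() {
  Serial.println(requestToken());
  delay(60000);                    // fetch a fresh token once a minute
}
```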

The sensor that I am using is an LDR (Light Dependent Resistor) which is connected to the Arduino’s Analog-Digital-Converter (ADC) port A0 using a voltage divider circuit. The circuit was set up on a breadboard and connected to the Arduino Yun.
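Reading that voltage divider on the Yun is essentially one call to analogRead; the mapping to a percentage below is just for readability and the divider values are assumed.

```cpp
const int LDR_PIN = A0;            // LDR + fixed resistor as a voltage divider into A0

void setup() {
  Serial.begin(9600);
}

void loop() {
  int raw = analogRead(LDR_PIN);                  // 0..1023 from the 10-bit ADC
  int level = map(raw, 0, 1023, 0, 100);          // rough brightness percentage
  Serial.print("light level: ");
  Serial.println(level);
  delay(1000);                                    // one sample per second
}
```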


YaaS client library in the Arduino IDE

I am using the Arduino String library to construct the JSON strings for the HTTPS request bodies. The Arduino Yun YaaS library I created only implements the following features: requestToken(), securePostRequest() and a super simple jsonStringLookup() to parse the token from the JSON response. Packing this functionality into the 32 KB of program memory and 2 KB of RAM of an Arduino was a real challenge. When you run out of RAM on the microcontroller, things just stop working without any warning, and the Arduino IDE only offers serial messages for debugging.


Successfully requested OAuth2 token and uploaded a sensor value to the YaaS document repository

In the process of writing the YaaS HTTPS library, I realized that making requests by wrapping CURL does not provide a great level of flexibility when it comes to error handling and retrying requests. There are also some drawbacks to having a Linux SoC on the same microcontroller board: battery operation is difficult due to its power requirements, and a full-blown Linux system has to be maintained and kept secure over time.

With my Arduino Yun PoC I have proven that it is possible to connect an Arduino to our new microservices-based cloud platform YaaS. I learned that doing HTTPS requests from an 8-bit microcontroller is only feasible if you are using a more powerful gateway such as the Linux SoC included in the Arduino Yun. As a next step, it would be great to be able to visualize the sensor data which I am pushing to the document repository. That’s a story for another blog post though.

Max, what are you soldering there?

“Ouch! (his fingers obviously… hihihi…) I’m just building a prototype board to connect the TV screens in the labs space to a logical control system, which we can then use for IoT purposes. And to connect this we need a special adapter; that’s what I’m soldering right now.”


“How does that work? Not the soldering, the system.”

“It’s basically just a serial adapter which uses an Ethernet connector instead of a normal RS232 connector. And because we need something like this for a Raspberry Pi and we don’t want to stack three adapters in a row, we’re building it on a circuit board ourselves.”

“Tell us a bit more about why you’re doing this.”

“Our existing video solution in the labs space isn’t really satisfying in terms of the functions it provides. We want a video system that allows us to control at any time which video is playing, so that we can use the system to display events generated by other IoT prototypes, e.g. Funky Retail. If a customer’s presence is detected, Funky Retail would normally just light up and a video would play. And this is all static, right? Meaning, the Raspberry Pi that is built into the Funky Retail system would then play a video. What we could do is use this ‘customer presence’ event to trigger some action on any of the screens in the labs space. It doesn’t need to be a simple video, it could be anything. Things like: you walk over from one screen to the next and the video follows you, at the right seek position in the video itself.”

“How exactly did you solve the problem you had connecting to the TVs?”

“That’s what I’m doing right now. The thing is, each TV needs two connections. One is HDMI, which is no problem, and the other is a serial connection which we can use to control the screens we already have in the labs space. We could use some other technology to control the TVs, but the ones we have don’t support this technology. It’s called HDMI CEC (Consumer Electronics Control) and it only exists in consumer TV screens, but we have professional screens. That’s why we need to use the professional controlling option which is a serial connection. So we need two cables to each screen. And what I’m soldering right now is the adapter for a Raspberry Pi to control each screen. And this is done via an Ethernet cable. What happens is, each Raspberry Pi will have an HDMI connection to a screen and a serial connection. Then we’ll have one controlling Raspberry Pi for each screen.”

“And at the end we can control the screens from wherever we want through the internet?”

“Yes, that’s the point. The Raspberry Pis will then connect to a central broker system which is basically just a server on AWS. … Damn, too short! (I think I might be distracting Max here a bit…)
So, I’m just solving the hardware problems right now. The stories or the applications we can build based on this are much more interesting than the hardware. This will be included into other prototypes as an output source. We have a lot of input sources in our IoT prototypes, e.g. pick up events, presence event, etc., but most of our output events are flashing lights and web UIs. If what I’m building here works, we could utilise any TV screen that is totally independent from our other prototypes as an output. That’s the idea and we’ll have the applications pretty soon. The easiest one is just to replace the static environment for Funky Retail, which already plays a video when you pick something up, and uncouple the playback of the video from the controlling Raspberry Pi of Funky Retail. This totally makes sense.”

Thanks Max, hopefully we can celebrate the success soon! And be careful with that soldering iron…

Andreas, no Lego this time?

The last time I asked Andreas what on earth he was doing, his desk was full of Lego. Quite a valid question in that case. The result of this work is called Augmented Commerce.

This time things look a little more ‘conservative’…


…but still: “Andreas, what technology are you just working on?”

“I’m working with some old friends of hybris labs: a Raspberry Pi and an Arduino. And of course we have the usual suspects like Neopixel rings and so on…  But this time we’re also trying to connect to YaaS.”

“Ok, now what connects to what? What does what and why?”

“The Raspberry Pi does the talking to the YaaS platform, taking control of the whole process. And the Arduino is responsible for the hardware stuff, like LEDs and the proximity sensor. This time we’re also using a coin slot, which is kind of new. Basically, we’re trying to enable the customer to pay on the YaaS platform with coins.”

“So, we’re building a smart vending machine…”

“I wouldn’t call it that, because it can be more. The idea is that you can connect anything to what we’re building here. We’re not providing a complete vending machine, but only the YaaS-connection. You have your front end through which you can pay and interact with this device, but it’s only the interface of YaaS. What you connect it to in the end is up to you.”

“Apart from lights flashing when I throw in a coin, what else happens? There’s something going on with NFC or RFID, right?”

“At the moment it’s RFID (developer language for: ‘I’ve got no idea if this is going to work. It could be anything at the end. RFID, NFC, telepathy…’). The idea was that somehow you have to authenticate yourself to access your YaaS account. We wanted to do this with RFID, at least that’s the plan for the moment (for translation see above). So, you go up to the machine and through the proximity sensor it notices you’re approaching. It greets you and says ‘Hello, please swipe your card, NFC tag, RFID token, whatever… (Ok, now I’m being a bit mean. To be fair, the things Andreas builds usually work really well. There is a reason his second name is Brain.) Once it knows who you are, you can order the product of your choice. Then you are asked to please pay. You can either throw in coins, or you use the credit you already have on your YaaS account, or you pay by Bitcoin.”

“So, you can either pay physically or digitally via your account. But when you overpay with coins you don’t get any change back, right?”

“That’s the idea, because it simplifies our device. We only take money and don’t return any…”

“Makes sense, sounds good.”

“…yes. What you overpay goes directly to your YaaS account and you can use it the next time.”

Thanks Andreas, we’re curious to see the finished prototype.

Intro to IoT with the ESP8266 microcontroller board

Today we want to give you an introduction to a new module that has gotten a lot of attention in the IoT community lately. It doesn’t have a sexy name like the Particle Cores, Espruinos, LightBlue Beans and all the others out there. The name is simply ESP8266. Anyone exploring IoT components these days has probably run into one of the ESP8266 modules.
The first time I came across this little module was in August 2014 on the hackaday blog. Back then it seemed to be more of a cheap alternative to existing WiFi modules for Arduinos, with a price tag of under $5 compared to the Arduino WiFi shields that run around $40. The SDK for the SoC of this module was not very mature and most of the documentation was available only in Chinese. The only usable firmware supported AT commands.
But luckily a lot has changed since then. There are now more than 12 different variants of the ESP8266 available and multiple firmwares; many projects are using it, and it’s becoming more popular every day. With this post we want to give an overview.

 

Available module variants

ESP-01

  • Dimensions: 14.3mm x 24.8mm
  • PCB antenna
  • GPIO0/2/16
  • Very common module

 

ESP-02

  • Dimensions: 14.2mm x 14.7mm
  • U-FL connector
  • GPIO0/2/15
ESP-03

  • Dimensions: 17.4mm x 12.2mm
  • Ceramic antenna
  • GPIO0/2/12/13/14/15/16
  • Very common module
ESP-04

  • Dimensions: 14.7mm x 12.1mm
  • No antenna
  • GPIO0/2/12/13/14/15/16
ESP-05

  • Dimensions: 14.2mm x 14.2mm
  • U-FL connector
  • No GPIO
ESP-06

  • Dimensions: 14.2mm x 14.7mm
  • No antenna
  • GPIO0/2/12/13/14/15/16
  • Metal shield claims FCC
ESP-07

  • Dimensions: 22mm x 16mm
  • Ceramic Antenna & U-FL connector
  • GPIO0/2/4/5/12/13/14/15/16
  • Metal shield claims FCC

 

ESP-08

  • Dimensions: 17mm x 16mm
  • No antenna
  • GPIO0/2/12/13/14/15/16
  • Metal shield claims FCC
ESP-09

  • Dimensions: 10mm x 10mm
  • No antenna
  • GPIO0/2/12/13/14/1

 

ESP-10

  • Dimensions: 14.2mm x 10mm
  • No antenna
  • No GPIO
ESP-11

  • Dimensions: 19.3mm x 13mm
  • Ceramic antenna
  • GPIO0/1
ESP-12

  • Dimensions: 24mm x 16mm
  • PCB antenna
  • ADC + GPIO0/2/4/5/12/13/14/15/16
  • Very common module
  • Metal shield claims FCC

Here are some modules we rarely see being used, but we want to mention them here for completeness.

WROOM-01

  • GPIO0/2/4/5/12/13/14/15/16
WROOM-02 / ESP-13

  • GPIO0/2/4/5/12/13/14/15/16

 

ESP8266 ESP-12 wiring

In our exploration of the ESP8266 we have mostly focused on the ESP-12. We had a bunch of ESP-01 and ESP-12 modules available that we got off AliExpress, but the ESP-12 was chosen mainly because it has more GPIO pins.
The wiring for flashing a new firmware is pretty straightforward. In addition to connecting VCC and GND, you need to pull up CH_PD and GPIO2 and pull down GPIO0 and GPIO15. After running into unstable behavior of the module, we decided to add a dedicated power supply to ensure a stable supply. The last missing part for flashing a new firmware is an FTDI USB-serial adapter: connect its TX to RX, RX to TX and GND to GND. We don’t need its VCC since we already have a power supply. With the wiring as shown below, we should be ready to get our firmware onto the ESP8266.

Wiring of ESP8266 ESP-12

 

Available Firmwares

Espressif’s Official Firmware

The official firmware from Espressif gives you the best performance and the most control over your implementations, with more memory available for code than the alternatives below. There are many projects using this firmware; the most popular is probably esp_mqtt. The downside of this firmware, though, is that you have to set up a full toolchain on your development machine, which can take some time. Even though some of the libraries are not open-sourced, there are very frequent releases at the moment. This helped us to get the ESP8266 connected to the hybris-as-a-Service offering over HTTPS, as older firmware versions have a broken SSL library.

NodeMCU

The NodeMCU firmware was initially released at the end of 2014 and allows you to write your application code in Lua. This firmware is also very popular and allows quick prototyping. Unfortunately, NodeMCU is based on an older version of the official firmware. Due to memory restrictions this base can’t easily be upgraded, as the remaining memory for custom code would be too small to do any serious coding. This also means that the recent fixes made to the SSL libraries etc. are not available in NodeMCU.

Frankenstein

This firmware, as the name suggests, consists mostly of different bits and pieces that are publicly available. It is meant mainly as an alternative to an AT firmware and has limited control of GPIOs.

Sming

An open-source, native firmware that allows you to work with GPIO in Arduino style. It comes with great built-in modules and is also compatible with Arduino libraries, but is unfortunately based on an older version of the official firmware.

 

Other firmware projects in progress

Micro Python Port for ESP8266

  • highly experimental
  • Python REPL over UART0
  • Garbage collector

Espruino Port for ESP8266

  • Port of Espruino JavaScript engine
  • Slow progress

 

Toolchain setup on Mac OS X Yosemite

As mentioned above, we need to set up the toolchain in order to compile the firmware and flash the ESP8266. In the following walk-through we’ll assume you have the drivers installed for the FTDI USB-serial adapter. In most cases this should be pretty straightforward.

 

Essentials

First we need to install some essential tools required to build the toolchain. We are mostly using Homebrew to install additional software, but you will also find the same packages on MacPorts.

 

Toolchain and SDK

In the next step we need to create a case-sensitive filesystem, clone the esp-open-sdk repository and build it.

Once we have a successful build, we need to add the toolchain path to our environment: edit your shell profile, add the toolchain’s xtensa-lx106-elf/bin directory to your PATH at the bottom, and don’t forget to reload your terminal afterwards.

 

Flashtool

The last missing piece of the setup is a tool for flashing the compiled firmware onto the ESP8266. We’ve used ESPTool in most cases. It uses the Python library PySerial, which you can install with pip.

For the ESPTool itself, you can simply git clone the repository and run the included esptool.py script directly.

 

First build

To verify that we set up the toolchain correctly we can compile a sample project. As I mentioned above, esp_mqtt is a very popular project. So let’s just compile that.

Now we can compile our own projects, too. It would be cool to integrate the toolchain into an IDE. We would have code highlighting, code completion and would also be able to build and flash the firmware in one single tool.

 

Eclipse setup

Download Eclipse IDE for C/C++ Developers and install it. Once done, you can import the esp_mqtt project into Eclipse as shown on the screens below.

To allow building and flashing of the firmware we can simply add the make targets (all, clean, flash) on the right, as shown here.

One last important part is still missing: we need to add the PATH variable to Eclipse to help it find the toolchain. In the Preferences, under C/C++ → Build → Environment, add a PATH variable with the value /esptools/esp-open-sdk/xtensa-lx106-elf/bin and we should be ready to develop our application.

 

First request to YaaS

We had already successfully used MQTT with the ESP8266, but the upcoming hybris-as-a-Service platform YaaS follows a microservices architecture and is all RESTful web services over SSL. As mentioned earlier, the SSL libraries were broken in earlier versions of the SDK; the latest official SDK, though, has proper SSL support. We got the ESP8266 to talk to YaaS with the sample code from Espressif. Just configure the WiFi SSID and password in the user_set_station_config function, and replace NET_DOMAIN and TLSHEAD at the top with your server address and HTTP request.
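For reference, here is roughly what that station configuration looks like in the Espressif non-OS SDK samples. The SSID and password are placeholders, and the exact surrounding code depends on the SDK version and sample you use.

```c
#include "osapi.h"
#include "user_interface.h"

void ICACHE_FLASH_ATTR user_set_station_config(void) {
    char ssid[32] = "MY_SSID";            // placeholder WiFi credentials
    char password[64] = "MY_PASSWORD";
    struct station_config stationConf;

    stationConf.bssid_set = 0;            // no need to pin a specific AP MAC address
    os_memcpy(&stationConf.ssid, ssid, 32);
    os_memcpy(&stationConf.password, password, 64);

    wifi_set_opmode(STATION_MODE);        // plain station (client) mode
    wifi_station_set_config(&stationConf);
}
```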

Compile and flash the firmware onto the ESP8266, then restart the module to let it run the custom code we just flashed.

There it is! The ESP8266 got its first token from YaaS!

Moto – "It’s so simple, even Nick can run it…”

Thanks Lars… Have you finished playing with Minecraft?…

So, what is Moto? Moto is the fourth prototype of our IoT series, and again we’re shifting the focus of what we want to demonstrate. When we built the Smart Wine Shelf, we concentrated on the customer experience. With Funky Retail we focused more on the analytics. Tiles was a step closer towards exploring the technological aspects around mobility. And now finally, with Moto we’re diving even further into IoT technology and the possibilities it offers to permanently reconfigure a prototype’s functions.


The ‘active’ physical components of a Moto are a distance sensor, a turning platform, and an LED ring. Moto connects to an Android app via BLE and uses the device as a data hub. The components and their actions can be connected in any way that seems sensible, by means of a programming tool called Node-RED. And exactly this is the essence of Moto: Node-RED allows users without special expertise in coding and programming (like Nick…) to configure an IoT-based system. The actual Moto and the actions taking place merely serve as an example. These actions are displayed in a web UI through which they can also be triggered.

We’re deliberately not telling a specific business story around this prototype. Basically it’s a bit of plug & play for IoT.



Read the more techy posts on Moto here!

First Lego, now Minecraft. Yes, we love our bricks!


It is continuously becoming more difficult to sustain a touch of seriousness here… Imagine one of your colleagues just finishing playing with Lego at his desk when the next one starts “working” with Minecraft and shares the fruits on the big screen in the office. Questions do arise… Okay, let’s try to explain.

Lars is currently exploring how IoT and SAP HANA fit together with a very simple approach. He’s measuring the room temperature in the office. Don’t laugh! This is serious stuff! Yeah okay, it does sound a bit ridiculous actually… but here goes: Lars connected a temperature sensor to a BeagleBone Black, which sends the data to the SAP HANA Cloud Platform. From there it goes to a Raspberry Pi that has Minecraft running on it and is connected to a TV. So much for the basic architecture. Here’s the idea: Lars wrote a script with which the temperature data can be displayed in a Minecraft landscape. Why? Because this is what it looks like in SAPUI5:


Lars is planning to add more sensors. Suggestions, anyone? Maybe even with a commerce-related background…?

Making IoT visual. Using Node-RED for hybrislabs moto.

Our prototype ‘moto’ is just taking an interesting turn. It’s technically pretty robust, we’re finishing off some UI and product-choice things, and we asked ourselves: that’s it? While we’ve developed one story for each prototype so far, it seems we’re focusing on different, higher-level issues when it comes to IoT with moto: the issue of how things are wired up, how things can interact and how existing configurations can quickly be changed.

So instead of one story, we’ll have many stories for moto. It will stay interesting, from a technical perspective, too. But the real story is: using Node-RED extensions, we’re able to rewire the logic of moto very quickly. And: it’s a tool for business-users. No hardwired setup.


Just like most other IoT-focused prototypes that we have (Wine Shelf, Funky Retail, Tiles), Moto has a REST-based web API to control the light and motion, as well as a webhooks-based system to communicate with the outside world. But because moto is also built around MQTT (just like Funky Retail), it is easy to extend. With Node-RED and MQTT, we have a direct hook into the core messaging system of our IoT prototype. And we’re using the extensions we’re currently working on to make the commands and events easy to wire up. Take a look at a very simple example:


 

Here, a node triggers every 5 seconds. It first hits the ‘red, slow, counter-clockwise’ node, and the connected moto (here #2) begins to turn red and slowly rotate counter-clockwise. After a delay of 2 seconds, it turns green and moves clockwise, fast. And finally, after another 1-second delay, it turns off the motor and the LEDs turn white. It then starts over again.

We’re just having our first successes with Node-RED, but it looks very promising as the brain of our prototypes. It might be a smart move, as others (e.g. non-technical people, the business guys, the guys with the smart stories) can rewire moto in whatever flavor they want.

Next up in tech are the input nodes, e.g. moto can send presence events that can start interactions.

Just to wrap up, here is another example where we took the current temperature in Munich, converted it into a moto command and sent it to moto to display the temperature via light:


Let us know what you think!