Lars, what are you doing?

We’re continuing our safari through the labs area of the hybris office in Munich. Today we’re visiting Lars in his natural habitat…

IMG_2675 1

“Lars, what are you doing?”

“At the moment I’m working on Bluetooth Low Energy (BLE). As you can see I’m using beacons to read temperature via BLE.”

“Yes, obviously… Why?”

“Because I can… It’s just an interesting way to get ideas about how to use BLE for the transportation of data. It’s not only temperature. You can also create services that allow you to write and read data via Bluetooth. And this is a prototype that reads the temperature out of a service.”

“And what components are you using?”

“This is an ‘Intel Galileo’ with a Bluetooth and a WiFi shield on it. An ‘Intel Edison’ already has that integrated and I’ve actually got one, but I’m using it for something different. I’ve attached an LCD display to show the temperature and the Beacon ID. And I’m using the Node.js module ‘noble’ to scan for beacons around the office and then read the BLE data from a specific one.”
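Lars’ actual beacon layout isn’t shown in this post, but as a rough sketch: a noble-based reader ends up with a raw buffer per characteristic and has to decode it. The byte layout below (signed 16-bit little-endian, in hundredths of a degree Celsius) is purely an assumption for illustration.

```javascript
// Hypothetical decoder for a BLE temperature characteristic.
// Assumed layout (not from the post): int16, little-endian,
// temperature in hundredths of a degree Celsius.
function decodeTemperature(buf) {
  if (buf.length < 2) throw new Error('payload too short');
  return buf.readInt16LE(0) / 100;
}

// With noble, the value would arrive in a callback roughly like:
//   characteristic.on('data', (buf) => {
//     console.log(`temp: ${decodeTemperature(buf)} °C`);
//   });

console.log(decodeTemperature(Buffer.from([0x0a, 0x09]))); // → 23.14
```

In a real noble script this function would run inside the characteristic’s data callback, after scanning, connecting and discovering the right service.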

IMG_2628 1

“What are you trying to achieve?”

“The final goal would be to store and read some data on a BLE Beacon. So I’ll be looking a bit more into the hardware and will use an ARM Cortex-M microcontroller and a BLE processor.”

“I heard you guys chatting about some new BLE stuff recently. What’s that all about?”

“Oh yes, last week Google came up with ‘Eddystone’, an open Beacon format. ‘Radius Networks’ was kind enough to send me some Eddystone Beacons and I’m playing around with them.”

“What’s the benefit?”

“Eddystone will work with Android and iOS and combines iBeacon and the Physical Web, which is also a Google project in its early stages.”

Thanks Lars, and just let us know when you need a bigger desk…

What on earth are you doing, Georg?

When you walk through the hybris office in Munich and approach the area where the labs team sits and works, you might occasionally witness some slightly peculiar behaviour. But once you get closer… you’ll probably ask yourself: what on earth is going on here?! So, when I recently saw Georg repeatedly banging a ‘Maßkrug’ (the traditional Bavarian beer measure) against another glass, I asked him exactly that question.

IMG_2627 1

To be precise, I let that image settle first and asked him a little later “Why is there a Maßkrug on your desk?”

“Because I’m building something for the Oktoberfest of Things.”

“What are you building?”

“I’m building something that will detect how people connect during the evening while drinking beer. So every time people clink glasses, an event will be sent out from this group of people to their phones. And we’ll use hybris CDM to basically correlate which people are drinking together and how this changes through the night.”

“How exactly are you going to detect that?”

“Well, we have an accelerometer here and a shock sensor, so we know when a mug hits something (hopefully another mug…) This data will be collected by the phone and then sent to YaaS.”

“So what happens if I steal someone else’s Maß?” (Very clever question…)

“Then you’ll be him. (ey???) No, no… so, you attach this ‘thing’ to your Maßkrug, then you open the app and scan for this… let’s call it a cheers-detector. You can pair it, so it’s only connected to your phone, and via that we send the data back to the cloud. Each time you get a new Maß you’ll have to reattach the detector. I’ll still work on the design to make that more convenient. Your phone will have an active connection to your detector, but it can also scan for other ones, because they’ll be acting as beacons.”

“Does that mean I can see how much someone has already drunk before I start talking to him or her?”

“You could see how often that person clinked glasses…and then maybe you could do some calculation…”

“Any data privacy issues?”

“Well they have to actively opt in. So, if you pair your phone with the detector and take part in this experiment, then you allow us to use the data.”

Thanks Georg, I think we’re all quite curious to see what comes out of this!

Moto – “It’s so simple, even Nick can run it…”

Thanks Lars… Have you finished playing with Minecraft?…

So, what is Moto? Moto is the fourth prototype of our IoT series and again we’re shifting the focus of what we want to demonstrate. When we built the Smart Wine Shelf, we concentrated on the customer experience. With Funky Retail we focused more on the analytics. Tiles was a step closer towards exploring the technological aspects around mobility. And now finally, with Moto, we’re diving even further into IoT technology and the possibilities it opens up to permanently reconfigure the prototype’s functions.

IMG_2566[1]

The ‘active’ physical components of a Moto are a distance sensor, a turning platform, and an LED ring. Moto connects to an Android app via BLE and uses the device as a data hub. The components and their actions can be connected in any way that seems sensible, by means of a programming tool called ‘Node-RED’. And exactly this is the essence of Moto: ‘Node-RED’ allows users without special expertise in coding and programming (like Nick…) to configure an IoT-based system. The actual Moto and the actions taking place merely serve as an example. These actions are displayed in a web page UI through which they can also be triggered.

We’re deliberately not telling a specific business story around this prototype. Basically it’s a bit of plug & play for IoT.

Screen Shot 2015-06-05 at 4.08.13 PM Screen Shot 2015-06-03 at 11.12.29 AM

Screen Shot 2015-06-16 at 5.05.20 PM

Read the more techy posts on Moto here!

Moto Update: the smartphone is now our MQTT/BLE Gateway

It’s time for an update about ‘moto’ – sorry that this did not happen earlier, but I’ve been busy with events like #cebit, #iotcon or #internetworld. We’ve now finalized the hardware design and our good friends at DerGrueneFisch are manufacturing a small series of moto prototypes (9 to be exact) over the coming weeks. This also means that I’ve moved on from hardware challenges to more software-related ones.

Moto Architecture Diagram

If you remember the architecture diagram (find it again above), we connect the motos wirelessly via BLE. While I’ve been using some quick & dirty node.js scripts on my Mac for testing the communication over BLE, I’ve now written an Android app that acts as an MQTT/BLE gateway. Powered up, it launches two services: the BLEService and the MQTTService. These services are started and then continue to run in the background, loosely coupled via Android Intents. Right now, we fire up the services when the Android Activity (that’s what is shown on the screen when an Android app starts) becomes visible, and we stop them again once the app becomes invisible. This is really convenient, as we tear down and fire up the services a lot during testing.

BLEService
This sticky service (meaning the system may restart it if it was removed due to resource constraints) scans for new, not-yet-connected motos and then tries to connect. Once connected, we subscribe to notifications for one BLE characteristic, which acts as the event stream from the hardware. We also save a reference to another identified characteristic that we use to send our commands to. In order to react to commands and be able to forward events, we use Android Intents: the BLEService registers listeners for all intents that the MQTTService sends out, as they contain the moto commands that need to be forwarded to the motos. The BLEService also maps the incoming commands to the corresponding motos and – new – is now namespaced. That means users of the Moto Android app will later be able to choose their namespace, so their analytics data is kept separate from everyone else’s.

MQTTService
For MQTT, we’re using the only Android/Java MQTT client we were able to find: Paho. Although there seems to be an existing Android service wrapper around the Paho MQTT client, it is poorly documented and it really was simpler to create our own service that does exactly what we want. The MQTTService is again sticky and should be running all the time. It tries to keep a constant connection to the MQTT broker that we host on Amazon EC2 and is subscribed to all commands that fall into its namespace, e.g. moto/<namespace>/+/command – an MQTT topic with a + wildcard, meaning it will receive messages sent to moto/<namespace>/1/command, for example.
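To make the + wildcard concrete: it matches exactly one topic level. A minimal matcher (not Paho’s internals, just the rule) could look like this:

```javascript
// Minimal MQTT topic matcher for single-level (+) wildcards,
// illustrating why moto/<namespace>/+/command catches the
// command topics of every moto in that namespace.
function topicMatches(filter, topic) {
  const f = filter.split('/');
  const t = topic.split('/');
  if (f.length !== t.length) return false; // + never spans levels
  return f.every((level, i) => level === '+' || level === t[i]);
}

console.log(topicMatches('moto/labs/+/command', 'moto/labs/1/command'));  // → true
console.log(topicMatches('moto/labs/+/command', 'moto/other/1/command')); // → false
```

MQTT also defines a multi-level # wildcard, which this sketch deliberately leaves out since the gateway only needs +.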

Getting MQTT or BLE running on Android alone and for a small demo is pretty easy. The complexity comes once you try to connect to multiple devices at once, because the Android BLE APIs are synchronous and firing too many BLE requests at once will simply drop a few of them. So one has to work with a few delays and timers here and there to make sure it really works reliably. The idea is also that sales agents with the app installed can roam freely, and if one is close to the BLE devices, their phone/app will connect transparently. So far, this works really nicely. After a few seconds outside of the coverage area, the BLEService starts to receive disconnect callbacks and we remove the moto element from the list of connected ones. This enables it to be picked up by another sales agent whose device has the app installed.
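The serialization idea can be sketched independently of Android: queue every GATT operation and only start the next one when the previous one reports completion. This is a simplified node.js-style sketch, not the actual app code:

```javascript
// Sketch of serializing BLE requests: run one GATT operation at a
// time, starting the next only when the previous one calls done().
// (On Android the same idea applies: queue reads/writes instead of
// firing them concurrently, since overlapping requests get dropped.)
class GattQueue {
  constructor() {
    this.ops = [];
    this.busy = false;
  }
  enqueue(op) {           // op is a function: (done) => { ... done(); }
    this.ops.push(op);
    this._next();
  }
  _next() {
    if (this.busy || this.ops.length === 0) return;
    this.busy = true;
    const op = this.ops.shift();
    op(() => { this.busy = false; this._next(); });
  }
}

// Usage: operations complete strictly in order, never overlapping.
const order = [];
const q = new GattQueue();
q.enqueue(done => { order.push('write LED'); done(); });
q.enqueue(done => { order.push('write motor'); done(); });
console.log(order); // → [ 'write LED', 'write motor' ]
```

In practice each done() would fire from the BLE stack’s write/read callback, and a timeout per operation guards against callbacks that never arrive.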

The Protocol
At least for now, I’ve also frozen the “protocol”, i.e. which characteristics are used, what data is sent, and how it is determined what is possible and what is not. First of all, for sending and receiving data from/to the moto elements, I use two separate BLE characteristics. This simply keeps everything a bit more organized and easier to understand. For sending from the BLE hardware to the smartphone, struct-based events are used, defined directly in the Arduino IDE.

Mainly due to issues with setting up multiple BLE notifications from Android at once, I decided to distinguish the two events that I send out via the first byte – see the “eventType” byte, which is different for a PresenceData event and a MetaData event. MetaData events are sent out regularly to inform the smartphone, and later the server, that a device is live. We visualize the MetaData events again via heartbeats: you can tell within 10 seconds if a device is connected or not. The PresenceData events are sent whenever the presence state (customer in front/customer lost) changes. Just like with tiles, we also calculate the duration of the presence directly on the device.
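The Arduino-side structs aren’t reproduced above, so here is a hypothetical gateway-side decoder just to show the eventType dispatch; every field name and byte position below is an assumption, not the real moto wire format:

```javascript
// Hypothetical gateway-side decoder for the moto event stream.
// Assumed layout (the original Arduino structs are not shown here):
//   byte 0:  eventType (1 = PresenceData, 2 = MetaData)
//   PresenceData: byte 1 = present (0/1), bytes 2-3 = duration in s (uint16 LE)
//   MetaData:     byte 1 = battery percent
const EVENT_PRESENCE = 1;
const EVENT_META = 2;

function decodeEvent(buf) {
  switch (buf[0]) {
    case EVENT_PRESENCE:
      return { type: 'presence', present: buf[1] === 1, duration: buf.readUInt16LE(2) };
    case EVENT_META:
      return { type: 'meta', battery: buf[1] };
    default:
      throw new Error(`unknown eventType ${buf[0]}`);
  }
}

console.log(decodeEvent(Buffer.from([1, 0, 42, 0])));
// → { type: 'presence', present: false, duration: 42 }
```

Keeping the discriminator in the first byte means a single notification handler can route everything, which is exactly why it helps when multiple notification subscriptions are flaky on Android.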

For incoming data, so-called moto commands, the protocol is slightly more complex. We distinguish between two broad categories of commands:

  • “standard” commands can change the current RGB colors and the motor state (this includes on/off, direction and speed level of the motor)
  • “special” commands are distinguished from standard commands by the value of the first byte. To be able to extend the command mechanism, they introduce a “subcommand” byte as the second byte. From the third byte on, the special command’s data is sent. Right now I’ve specified a “blink” command that will blink the RGB pixels for a certain duration in a certain color. Another implemented command is a rainbow chase, where the pixels update according to a color wheel, which in the end looks like a rainbow.

Some code example showing how I deal with incoming commands is below:
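As a stand-in, here is a sketch of how such command handling could be encoded; all byte values (the special-command marker, the subcommand IDs, the payload layouts) are invented for illustration and are not the real moto protocol:

```javascript
// Hypothetical encoder for moto commands. Only the structure follows
// the description above: standard commands carry RGB + motor state,
// special commands mark themselves with a magic first byte and add a
// subcommand byte before their payload.
const SPECIAL = 0xff;     // invented marker for special commands
const SUB_BLINK = 0x01;   // invented subcommand IDs
const SUB_RAINBOW = 0x02;

function standardCommand(r, g, b, motorOn, direction, speed) {
  return Buffer.from([r, g, b, motorOn ? 1 : 0, direction, speed]);
}

function blinkCommand(r, g, b, durationSecs) {
  // special command: [SPECIAL, subcommand, ...payload]
  return Buffer.from([SPECIAL, SUB_BLINK, r, g, b, durationSecs]);
}

function rainbowCommand() {
  return Buffer.from([SPECIAL, SUB_RAINBOW]);
}

// The receiver only needs the first byte to route the command:
function isSpecial(buf) { return buf[0] === SPECIAL; }

console.log(isSpecial(blinkCommand(255, 0, 0, 5)));             // → true
console.log(isSpecial(standardCommand(0, 255, 0, true, 0, 2))); // → false
```

Note that a real protocol would have to reserve the marker value, e.g. by capping the red channel of standard commands at 254, so the two categories can never collide in the first byte.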

Android UI Adapter
One last element that got a lot of love from me is a special UI adapter for the Android app. There’s nothing super special about this data/UI binding, it is just a lot of work. The UI will later try to come close to the action that the moto element is performing: if it blinks, the UI element in the Android app will blink, colors will be reflected as well as possible, and of course the spinning status will be represented. Once I have a few motos connected at once, I will shoot a few pics and show this to you in an update.

Up next
Now that I have the hardware specced out and a running gateway prototype via the Android app, the next thing I’ll spend time on is the server side that collects all the data. This will also include a RESTful API to control each moto element, client/server communication via socket.io for the UI, and early ideas for the skinning. I hope to receive the first elements of the produced series within 2-3 weeks and will try to update you on the progress.

 

Moto: exploring the smartphone as an IoT hub for retail

At #hybrislabs, we’ve explored IoT quite a bit now. We began with the smart wine shelf, our first IoT experience for retail spaces, which used a unique idea and combination of technologies to provide both customer and retailer value. Next up was funky retail, where we focused on the analytics in the retail space with both distance and pressure sensors. With tiles, we went wireless for the first time – but still used a central hub where all Bluetooth LE messages are collected and forwarded to the cloud.

Finally, with moto, we’re now filling a gap: we would like to explore one missing IoT topology in our portfolio, using the smartphone as a hub for the connected devices around you. Below is a pic of how the current prototype looks. In the end, it will be a glass-protected, spinning disk that is lit up from below. It will feature an IR distance sensor to detect customers and be able to change rotation speed and direction as well as the color. It will require a power cable, but communication will again be Bluetooth Low Energy. Here’s also a video of moto from a recent G+ post.

IMG_20150303_163718

What is more important, and sadly almost invisible, is *how* we connect these IoT elements. We’ll not use a central hub. Instead, the plan is to have iOS/Android apps installed on the sales assistants’ phones that automatically connect to the retailer’s smart objects. These apps connect via BLE and forward the data between the cloud and the things. The idea is that a sales assistant can freely move in the retail space. The app will scan and connect, might lose the connection from time to time and leave one “moto” disconnected, later move back in range and reconnect. If another sales assistant with the same app and configuration moves in range, he will take over. Here’s the architecture:

Moto Architecture Diagram

At this time, we’ve successfully connected to the motos and defined the rough BLE-based protocol that we’ll use. We’ve got some node.js based code that works on a Mac for experimenting and testing. Next up is the task to write a good Android app (iOS welcome, too) that launches, finds IoT elements, connects and then proxies the communication to the cloud. For the cloud communication, we’ll again use MQTT, but we still need to find a good and easy MQTT solution for Android/iOS. So if you have any good ideas and are able to point us in the right direction, let us know! (@hansamann or comment – we actually do read them!)

To wrap this up, here’s the raw PCB of moto with the NeoPixel RGB ring and IR distance sensor connected. The board again uses a LightBlue Bean for the BLE connectivity. As it is running on 9V for the stepper motor (which is not shown here), we need to step down the voltage twice: once to 5V for the NeoPixel RGB LEDs, and once to 3.3V for the LightBlue Bean. We’re also using a stepper motor driver, a DRV8834 on a breakout, that allows us to control the direction and speed of the stepper motor.

IMG_20150303_161139

The Physical Web, Connected Retail and IoT. Some thoughts.

The hybris Summit is just over and the hybris labs team presented many IoT-related prototypes to the customers and partners visiting. If you are following this blog, that’s no news 🙂 Today, #google was kind enough to send me a few “physical web” beacons and also two extra Intel Edison boards for all the more fancy ideas I might have. After some wine and wild thinking, here are some thoughts.

IMG_20150216_203005
The Physical Web
If you’ve never heard of this: it’s basically an Apple iBeacon, but instead of a crazy, cryptic UUID (essentially a long number), it sends around a URL to a website. The key thing here is to understand that an iBeacon only makes sense with a special app that scans and *interprets* the UUID. This could be Estimote’s SDK telling your app that beacon 124123412341324 should right now, actually, mean: show a coupon for the white sneakers in the showroom. We stopped believing a while ago that every customer would have the retailer’s app installed that enables commerce-centric use cases with iBeacons. But scanning QR codes for URLs, tapping NFC tags, or even typing URLs directly… really? How backwards 🙂

If only every thing would publish a URL
So now, the physical web tries to solve that problem. There will not be an app for everything. In the end, native apps won’t scale. They might be prettier, and for some time they looked like the only way to do mobile, but it just does not scale to the Internet of Things, where we talk about billions of smart devices. Broken down to commerce, we have not so many unique retailers around the globe compared to the complete IoT. Still, it is unrealistic that every customer walks into the retail space with the suitable app installed to unlock the next smart wine shelf. The physical web replaces the cryptic data sent via Apple iBeacon with URLs. Only problem: the BLE advertisements are small, so some compression similar to the NFC NDEF URL records is required. Combined with link shorteners, which are great for built-in analytics anyway, that seems like a solvable problem.
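The compression works roughly like the published Eddystone-URL tables: one byte replaces the URL scheme, and single bytes replace common domain endings. A sketch (the example URLs are made up for illustration):

```javascript
// Sketch of the Eddystone-URL compression scheme. The prefix and
// suffix tables follow the published Eddystone-URL spec; everything
// else here is a simplified illustration, not a full implementation.
const PREFIXES = ['http://www.', 'https://www.', 'http://', 'https://'];
const SUFFIXES = ['.com/', '.org/', '.edu/', '.net/', '.info/', '.biz/', '.gov/',
                  '.com', '.org', '.edu', '.net', '.info', '.biz', '.gov'];

function encodeUrl(url) {
  const bytes = [];
  const p = PREFIXES.findIndex(pre => url.startsWith(pre));
  if (p === -1) throw new Error('unsupported scheme');
  bytes.push(p);                       // one byte for the whole scheme
  let rest = url.slice(PREFIXES[p].length);
  while (rest.length > 0) {
    const s = SUFFIXES.findIndex(suf => rest.startsWith(suf));
    if (s !== -1) {
      bytes.push(s);                   // one byte for e.g. ".com/"
      rest = rest.slice(SUFFIXES[s].length);
    } else {
      bytes.push(rest.charCodeAt(0));  // plain character
      rest = rest.slice(1);
    }
  }
  return Buffer.from(bytes);
}

// A 27-character URL shrinks to 17 bytes, small enough for a BLE frame:
console.log(encodeUrl('http://labs.hybris.com/moto').length); // → 17
```

This is exactly why link shorteners combine so well with it: a shortened URL plus the table compression comfortably fits the tiny advertisement payload.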

Damn, the physical web needs an app 🙂
Dammit, did I just say it’s unrealistic that our customers will each have a dedicated app installed for every single store and the “things” therein? Right now, the physical web needs an app that scans and interprets the physical web beacons. The promise is: there will be one app. Ideally, at some point, it will be integrated into the operating system: your browser would be the natural place for such a web scanning feature.

So where will our physical web beacons go?
I’ll get to the Intel Edison “dynamic physical web beacons” over the next days, but first I will attach the 10 web beacons to some objects around the office. We have a few prototypes in the hybris labs space, and each will get one. Like in a museum, each beacon will forward to a unique blog post giving you some context and additional information about the prototype. I wish our fridge, filled with beer, had a beacon so we could track the takeout of a beer and track usage per employee. Oh, and one beacon should link to the Swarm (that was Foursquare, remember?) URL for our office, so people can check in easily. Maybe I should carry a web beacon, so whoever is close to me can scan the link to my G+ profile or Twitter account and follow me. Next time I give a presentation, I will first update a beacon with the URL to my #prezi presentation and distribute the share links like that.

IMG_20150216_210539

For the Intel Edison based beacons, I need some constantly updating source, so that a dynamic beacon makes sense. The latest blog post on the hybris labs blog might make sense at first sight. But after a few extra sips of wine: a simple HTTP redirect – aka the WEB – solves that issue. The lab.hybris.com RSS feed will already redirect you to the latest blog post. So why waste an expensive Intel Edison on this? Reporting a sensor value makes way more sense. To load a sensor reading directly off the web, your sensor needs to share it with the web. Using a smart web beacon, I can send the browser to a local web address and then read the value there. That local web address might be a retailer’s analytics system, with beautiful links to all the sensor data in my store. I’ll do that tomorrow or so… please send us some comments or tweet me directly!

Tiles in color, plus finalized arch poster

Now that we have all major events (except the hybris xmas party) behind us, we can finally focus on getting a few projects really finalized. Tiles made huge progress over the last weeks and I just got the fully-colored tiles in, plus I have a finalized architecture poster that I want to share with you. Big kudos go out to Elke and DerGrueneFisch, our booth building partners for this and most other projects. The tiles (21 in total, for 3 complete demo sets) are colored in 4 fresh colors for a change (no boring white!). I absolutely love the way they look. In one day, I soldered the first 7, which are currently connected to one hub.

IMG_20141208_114334

For the poster, Kathi at SNK did an awesome job. I already ordered our poster, which we’ll then present at our booth at the hybris summit 15 in Munich. Having a descriptive poster will greatly help us explain the IoT setup for this prototype. Right now we expect to have cans on top of the tiles, so we made that part of the poster.

tiles-90x60

 

 

Just to recap the architecture, have a read:

  • “Tiles” are the wirelessly connected platforms. We use Punchthrough’s LightBlue Bean and remove the battery holder to make the platforms 8mm high. We still use CR2032 batteries, which gives us about 1 week of battery life right now. We would get more, but I send out a MetaEvent every 10sec, which is hard on the battery.
  • The “Hub” collects all data. It scans for tiles, continuously, and connects. The hub runs on the Raspberry Pi, uses a BLE dongle (choice is key here) and uses node.js for all programming. It sends the data on to the server with CoAP – a UDP-based IoT protocol.
  • The “Server” collects all data for all hubs (yep, there can be many) and provides the necessary APIs for managing the user/tile association, authentication and authorization (OAuth2 used here), etc.

 

One change over the last days is that we can now associate products with the tiles. That means a store manager can just scan a tile (NFC or QR) and then add it to his private analytics page. The UI of these web pages is currently being worked on and will include a few cool features, such as a heartbeat every 10 seconds and the color of the scanned tile, which gets pulled from some static, factory-defined data. This system is all up and running now, currently with one live hub and 7 tiles connected.

14 - 2

What’s left is the callback mechanism plus the web UI. The callback mechanism will “call out” to external systems for each event received. So if a LiftEvent is received and a webhook is configured, we’ll send an HTTP POST to the configured external service. I also plan to pull in the product details from YaaS, hybris’ on-demand API offering.
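A sketch of what that callback mechanism could look like; the config and payload shapes here are assumptions, and the HTTP transport is injected so the dispatch logic stays independent of it:

```javascript
// Sketch of a webhook callback mechanism: for every received event,
// call out to each configured external URL whose event type matches.
// `post` is injected so the transport (e.g. node's http module doing
// the actual HTTP POST) stays pluggable and the logic stays testable.
function dispatchEvent(webhooks, event, post) {
  return webhooks
    .filter(hook => hook.eventType === event.type)
    .map(hook => post(hook.url, {
      tileId: event.tileId,
      type: event.type,
      timestamp: event.timestamp,
    }));
}

// Usage with a fake sender that just records outgoing calls:
const sent = [];
dispatchEvent(
  [{ eventType: 'LiftEvent', url: 'https://example.com/lift-hook' },
   { eventType: 'DownEvent', url: 'https://example.com/down-hook' }],
  { tileId: 7, type: 'LiftEvent', timestamp: 1417000000 },
  (url, payload) => sent.push({ url, payload })
);
console.log(sent.length); // → 1
```

A production version would add retries and timeouts around the POST, but the filter-then-forward shape stays the same.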

14 - 1

 

Tiles Update – we've added blinky blinky

Our project Tiles, little BLE-connected platforms for customer interaction tracking, is entering a project phase which allows me to blog and inform you a bit more. Since yesterday night, the Raspberry Pi and Arduino in the hub use one power source. This makes the overall design easier. We have also been working on a Raspberry Pi B+ HAT, using Eagle, to further optimize our design.

One visible change is that it now blinks 🙂 The hub rotates an LED light to signal the BLE scanning process. It flashes once you lift up the product (well, the apple in this case).

IMG_20141126_204216

 

We’ve now also locked down the architecture; the rough sketch below should help you understand it, followed by a quick summary.

tiles technical architecture

  • “Tiles” are the wirelessly connected platforms. We use Punchthrough’s LightBlue Bean and remove the battery holder to make the platforms 8mm high. We still use CR2032 batteries, which gives us about 1 week of battery life right now. We would get more, but I send out a MetaEvent every 10sec, which is hard on the battery.
  • The “Hub” collects all data. It scans for tiles, continuously, and connects. The hub runs on the Raspberry Pi, uses a BLE dongle (choice is key here) and uses node.js for all programming. It sends the data on to the server with CoAP – a UDP-based IoT protocol.
  • The “Server” collects all data for all hubs (yep, there can be many) and provides the necessary APIs for managing the user/tile association, authentication and authorization (OAuth2 used here), etc.

One more thing – I’ve connected the server to Xively, a data logging platform. We collect mainly the battery rundown, to estimate battery life, and also the temperature values from the LightBlue Beans. At this point I just want to share some nice graphs to show you how much sense it makes to track that data. It will definitely help us optimize the design and battery consumption further. Right now we stay optimized for demo purposes, but we can later reduce the events sent, for example, to get better battery life.

Screen Shot 2014-11-27 at 9.49.19 AM

Screen Shot 2014-11-27 at 9.49.13 AM

 

 

Next up: Tiles

I started blogging bits and pieces about BLE – Bluetooth Low Energy – a few weeks ago. The research we’re doing right now will help us to connect little, battery-powered “tiles” (hence the name) to a hub that collects data from these sensors. This prototype is part of our vision for a connected retail space. We’ve now received a first CNC-milled hardware prototype for the housing, which is shown below.

IMG_20141104_175100

 

We’ll now start connecting the bits and pieces. Our overall architecture consists of 3 parts:

  • the satellites – tiles – contain LightBlue Beans, little Arduino/BLE microcontrollers that run on batteries. The tiles also contain a single pressure sensor that is used to detect if a product is on top (or lifted up). The event data contains the tile’s ID, the event (up/down) and the battery level. The mechanism we will likely support in the end uses BLE notifications that originate from the LightBlue Bean and are received by the hub.
  • The hub is also still under development, but some early node.js code already works nicely on my Mac. The hub scans for the tiles, which send BLE advertisements, and connects to them. It receives the events via BLE notifications and will have to manage the tiles and pass on the events. The protocol we would like to use for the first time in this case is CoAP. It is essentially a binary version of HTTP, runs on UDP and is – as the name suggests – made for constrained applications. With that, we will have successfully used ZMQ, MQTT and CoAP when it comes to IoT protocols.
  • The server will receive all CoAP messages from the tiles, process and persist (or at least keep) the data, and allow users of the system – customers, store managers – to manage the tiles. We intend to print a QR code or attach an NFC tag to each of them. Once you touch a tile with NFC and have passed the OAuth2-based authorization, you can add the tile to your personal analytics view. The goal is to make it really easy for a store manager to add these ‘sensor elements’ (tiles) to his analytics view. Once a tile is claimed, the analytics data will not be accessible to anyone other than the person who claimed it.
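The event data described in the first bullet could be packed into a few bytes, for example like this; the real wire format isn’t documented in this post, so the layout below is invented for illustration:

```javascript
// Hypothetical compact payload for the tile events described above:
// tile ID, up/down state and battery level in four bytes.
function encodeTileEvent(tileId, up, batteryPct) {
  const buf = Buffer.alloc(4);
  buf.writeUInt16LE(tileId, 0);  // tile ID (two bytes, little-endian)
  buf.writeUInt8(up ? 1 : 0, 2); // product lifted?
  buf.writeUInt8(batteryPct, 3); // battery level in percent
  return buf;
}

function decodeTileEvent(buf) {
  return {
    tileId: buf.readUInt16LE(0),
    up: buf.readUInt8(2) === 1,
    battery: buf.readUInt8(3),
  };
}

console.log(decodeTileEvent(encodeTileEvent(12, true, 87)));
// → { tileId: 12, up: true, battery: 87 }
```

Keeping the payload this small matters twice here: BLE notifications carry very little data per packet, and CoAP’s whole point is staying lean over UDP.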

So… yes, it’s still quite a long way to go. I’ll try to update you once we make some progress. But let me know what feedback you have. Just tweet me or leave a comment directly on this blog.