IoT Workshop with Particle Photon

Just a quick update from an internal IoT event here at hybris. Today, we ran our IoT Workshop for the first time and had a hell of a lot of fun. We used our carefully crafted IoT Experimentation Kits, each including a Particle Photon, to teach 15 people about the Internet of Things. And yes, we’ve connected our first buttons to the YAAS PubSub Service, which was a lot of fun!

Below, you see the box that every participant received, plus a few other pics documenting the fun. At the beginning, we had quite some trouble getting some devices online. We bricked three devices, but in the end everybody was happy and got some YAAS buttons/LEDs connected.

IMG_20150922_092033

We’re now looking forward to seeing some cool creations over the next couple of weeks! Here are a few more impressions:

IMG_20150922_114951 IMG_20150922_114937 IMG_20150922_120348

Creating custom 3D-printed prototype cases with OpenSCAD

The latest prototype I am working on, code-named “infinite cart”, requires us to create a customized case. We’ve worked hard on miniaturizing the electronics – the box below contains a lot:

  • A Particle Photon – microcontroller with built-in WiFi for connectivity
  • A LiPo battery and charging circuit on our own custom-made PCB
  • An NFC reader
  • A vibration motor for haptic feedback, including the diode/transistor on our own PCB
  • A NeoPixel strip for visual signalling

Infinite Cart electronics inside

 

It’s so small (at least for us) that the cables within the case became an issue for the first time. Anyway, you can see why we needed to look into custom cases.

For the first 3D-printed boxes we used 123D. It’s a 3D design tool that works much like familiar CAD tools. While we gained some quick wins initially, one big issue became apparent very quickly: changing a tiny thing in the middle of your workflow is impossible. Your 3D design is a sequence of steps, and only the last step can easily be changed. Sure, you can “undo” a few steps and recreate them, but it’s really painful. Below, take a look at a few designs we’ve made.

Screen Shot 2015-08-10 at 10.44.49 AM Screen Shot 2015-08-31 at 7.57.36 AM

Besides the issues with flexibility, we also discovered that some 3D printing software printed the shapes in unforeseen ways. For example, we added a handle to one of the boxes and it was only very loosely joined with the rest of the shape. It looked like the 3D printer still thought of them as separate shapes…

So finally, we looked into OpenSCAD. It’s a programmatic 3D design tool – instead of working with your mouse and inaccurately sizing shapes on the screen, you code the 3D model. I was a bit scared at first, because OpenSCAD is said to be hard for “organic” designs – rounded corners, non-straight surfaces, etc. But see what we’ve made so far:

Screen Shot 2015-08-25 at 2.10.49 PM Screen Shot 2015-08-31 at 8.00.05 AM

It took me an evening to understand the basics and lose some fear; the next day I had replicated the early prototype box. We then started working on the more organic box (see right – curved side walls, etc.) and right now we have a pretty flexible case already. Height, width and length are all controlled by variables, so changing three numbers gives me a new design.
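To illustrate the idea (a minimal sketch with made-up dimensions, not our actual case), a parametric box in OpenSCAD can be as simple as:

```openscad
// All dimensions in mm – example values, not the real case.
width  = 70;
length = 100;
height = 15;
wall   = 2;    // wall thickness

difference() {
    cube([length, width, height]);                        // outer shell
    translate([wall, wall, wall])                         // carve out the inside,
        cube([length - 2*wall, width - 2*wall, height]);  // leaving the top open
}
```

Change the three variables at the top and the whole model regenerates – exactly the flexibility that 123D could not give us.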

It turns out that this flexibility is worth a lot, at least when it comes to creating cases for prototypes/electronics.

Just a quick update – I think I’ll turn this one into a detailed instructable, just like I did with the Oktoberfest of Things Beer table. So stay tuned!

 

 

Using Particle Webhooks with YAAS PubSub

The hybris YAAS cloud services, soon in public beta, make it really easy to incorporate commerce features into your applications. One of the core services that developers may use is the PubSub Service. It is essentially a queue: you put messages in, and someone else pops the queue to get the events out. We thought it would be cool to use the Particle WebHook system to fire events directly into the hybris YAAS PubSub queue. To make it a bit more visual, we bought a big, blue push button and 3D-printed a humongous case for it 🙂

IMG_20150731_133629 IMG_20150731_133646

So what needs to be done?

First, we set up a Particle Photon and flashed some custom firmware onto it. We also wired up the button and the LED within the button. Whenever you press the button, we send out a button_pressed event via the Spark.publish() function and blink the LED:
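The original code embed is gone, so here is a hedged reconstruction of the logic. The pin numbers and the active-low wiring are assumptions, not the original firmware. The stubs at the top only stand in for the Particle firmware API (digitalRead, Spark.publish, …) so the logic compiles and can be exercised off-device; on a real Photon those calls are provided for you.

```cpp
#include <string>
#include <vector>

std::vector<std::string> publishedEvents;   // stub: records published events
int simulatedButtonLevel = 1;               // stub: 1 = released (pull-up)

// --- stand-ins for the Particle firmware API ---
int  digitalRead(int /*pin*/)       { return simulatedButtonLevel; }
void digitalWrite(int /*pin*/, int) {}
void delay(unsigned long)           {}
struct SparkApi {
    void publish(const std::string& event) { publishedEvents.push_back(event); }
} Spark;
// -----------------------------------------------

const int BUTTON_PIN = 2;   // assumed: button on D2, pulled up, pressed = LOW
const int LED_PIN    = 3;   // assumed: LED inside the button on D3

bool wasPressed = false;

void loop() {
    bool pressed = (digitalRead(BUTTON_PIN) == 0);
    if (pressed && !wasPressed) {        // react to the press edge only
        Spark.publish("button_pressed"); // fire the event into the Particle cloud
        digitalWrite(LED_PIN, 1);        // blink the LED as feedback
        delay(100);
        digitalWrite(LED_PIN, 0);
    }
    wasPressed = pressed;
}
```

The edge detection matters: without it, holding the button would flood the Particle cloud with events on every loop() iteration.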

At this point, we’re already able to see the events that we publish via the Particle Dashboard.

Screen Shot 2015-07-31 at 2.29.59 PM

Next, we need to create a webhook that publishes the events into a YAAS PubSub queue. Below is the button_webhook.json file that defines the webhook. We use the particle-cli to activate it:
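A minimal definition could look roughly like this (a sketch: the YAAS endpoint URL is a placeholder, and the token has to be a real Bearer token – see the note below):

```json
{
    "event": "button_pressed",
    "url": "https://<your-yaas-pubsub-endpoint>/events",
    "requestType": "POST",
    "headers": {
        "Authorization": "Bearer <your-token>",
        "Content-Type": "application/json"
    },
    "json": {
        "device": "{{SPARK_CORE_ID}}",
        "event": "{{SPARK_EVENT_NAME}}"
    },
    "mydevices": true
}
```

The {{…}} variables are filled in by the Particle cloud for every triggering event.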

particle webhook create button_webhook.json

One thing to note: to publish an event, you need a Bearer token. We’ve used a little node.js script to obtain one and added the token to the webhook definition as shown above.

And that’s it. When you press the button, the button_pressed event triggers the webhook and creates a new item in the PubSub publish queue. How do you verify it? For example via curl:
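Something along these lines (a sketch only – the endpoint path and token are placeholders, and the real YAAS PubSub read API may differ):

```shell
# Placeholder endpoint and token – substitute your real YAAS PubSub
# read endpoint and a valid Bearer token.
curl -X POST "https://<your-yaas-pubsub-endpoint>/read" \
     -H "Authorization: Bearer <your-token>" \
     -H "Content-Type: application/json"
```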

Et voilà, done. We’ve created a simple IoT button that publishes an event to the PubSub queue. From there, any subscriber can wire it up with the world.

 

Wearable IoT

Just a quick update on the latest prototype we’ve been working on. In a nutshell, it’s a small, wearable device that a customer can use to scan a product. The product is added to a dynamically created wishlist, and later the customer, the sales agent or both can review the products the customer is interested in and configure the details of the shopping basket. The backend is all YAAS, the new commerce cloud offering from hybris, which will become publicly available later this year. If you’re interested in hacking with us, sign up for our cloud hackfest!

Below you can see the current prototype in action (click the image for the animated GIF).

output_tyvfgB

It’s currently the size of a small box, about 10x7x1.5 cm, and we’re pretty confident we can shrink it even further. You’ll probably wonder what we packed into the box, so here’s the impressive list of tech for this small box (pic below, including some beer):

  • The box itself is 3D-printed. While we’ve just ordered our lab’s own 3D printer, this one was quickly printed via Andi’s hub from 3DHubs.
  • Power is supplied via a 400 mAh LiPo battery. It is wrapped with a Qi charging receiver, and the charging circuit is part of the custom PCB that we created. That means we can run this prototype for about 1-2 hrs on battery and recharge wirelessly via Qi. We can also detect if the whole system is in charging mode – in this case the WiFi connection is dropped for faster charging, and the rough charge level is indicated via the number of lit-up NeoPixels.
  • The red square is a mini NFC reader – it connects to the microcontroller via I2C and allows us to scan the NFC tags.
  • Next to the NFC reader is a vibration motor. With each NFC scan, we quickly start the motor to signal the scan. We also light up the NeoPixel strip that runs along the walls of the case.
  • The heart of the prototype and all logic is the custom PCB that houses a Particle Photon. Underneath the Photon (with headers) we have some extra circuits like the LiPo charging circuit, charge detection and voltage approximation, and a transistor and diode for the vibration motor. It’s all 0805 packaging, meaning “small” and “hard to solder”.

IMG_20150717_120731

The Particle cloud is calling our YAAS cloud (sign up here for the private beta of our new cloud offering) via webhooks. The cloud creates new product wishlists for the customer’s shopping basket and will later also detect special NFC tags to retrieve the full list for further configuration of the wishlist.

IMG_20150717_131451

Let us know what you think in the comments!

Making IoT visual. Using Node-RED for hybrislabs moto.

Our prototype ‘moto’ is just taking an interesting turn. It’s technically pretty robust, we’re finishing off some UI/product-choice things, and we asked ourselves: that’s it? While we’ve developed one story for each prototype so far, with moto we seem to be focusing on different, higher-level IoT issues: how things are wired up, how things can interact, and how existing configurations can quickly be changed.

So instead of one story, we’ll have many stories for moto. It will stay interesting, from a technical perspective, too. But the real story is: using Node-RED extensions, we’re able to rewire the logic of moto very quickly. And: it’s a tool for business-users. No hardwired setup.

IMG_20150601_173538 (1)

Just like most other IoT-focused prototypes that we have (Wine Shelf, Funky Retail, Tiles), moto has a REST-based web API to control the light and motion, as well as a webhooks-based system to communicate with the outer world. But because moto is also built around MQTT (just like Funky Retail), it is easy to extend. With Node-RED and MQTT, we have a direct hook into the core messaging system of our IoT prototype. And we’re using the extensions we’re currently working on to make the commands and events easy to wire up. Take a look at a very simple example:

Screen Shot 2015-06-05 at 4.08.13 PM

 

Here, a node triggers every 5 seconds. It first hits the ‘red, slow, counter-clockwise’ node, and the connected moto (here #2) begins to turn red and slowly rotate counter-clockwise. After a delay of 2 seconds, it turns green and moves clockwise and fast. Finally, after another 1-second delay, it turns off the motor and the LEDs turn white. Then it starts over again.

We’re just having our first successes with Node-RED, but it looks very promising as the brain of our prototypes. It might be a smart move, as others (e.g. non-technical people, the business guys, the guys with the smart stories) can rewire moto in whatever flavor they want.

Next up in tech are the input nodes, e.g. moto can send presence events that can start interactions.

Just to wrap up, here is another example where we took the current temperature in Munich, converted it into a moto command and sent it to moto to display the temperature via light:

Screen Shot 2015-06-03 at 11.12.29 AM

Let us know what you think!

Moto Arch: pretty final

We’re really in the final phase of finishing moto – at least from the technical side. Our friends at SNK (Kathi) have produced an awesome architecture poster, which is mainly what I wanted to share with this post:

Screen Shot 2015-05-07 at 9.18.17 AM

To recap: the main purpose of #hybrislabs moto is to figure out how BLE devices in the retail space can be connected to the internet via the employees in that space themselves – via an app on a smartphone. This is an interesting topology that we have not touched yet. So there is no hub; the smartphone apps of employees take over that part. The things get connected based on reachability from an employee’s phone. While we have done lots of things (including NFC etc.) on mobile devices already, we never implemented a true BLE/MQTT gateway as a smartphone app. I am really happy we’ve fixed that.

By the way – have you discovered Bluz on Kickstarter? Same idea: the Bluz hardware is connected to the Spark Cloud (which gives you RESTful APIs, webhooks, data logging, etc.) via a gateway app that lives on a smartphone. Really the same idea – our implementation is a bit more narrow and probably not as generic as theirs.

Some more news:

  • Namespaces: the app now operates under a namespace. All employees of Store X can use the namespace X, which means all connected things report/forward events/commands under the appropriate namespace to the server. The server now also has namespaced UIs, e.g. a /groups/default path will present the connected motos for that namespace.
  • Webhooks: for each namespace, webhooks can be set up. A webhook is a callback mechanism for all 3rd-party or harder-to-integrate systems that can only speak HTTP. Instead of accessing our MQTT broker directly (which requires port and protocol access – tricky, especially in enterprise environments), a webhook can deliver the events from connected things via an HTTP POST request with JSON data as the payload. Works amazingly well and solves most of your integration problems.
  • REST API: while webhooks solve the problem of reporting events to legacy or third-party HTTP-based systems, the REST API allows other systems to access the things, e.g. to send commands down to the device. We’ve created a simple REST API that forwards the requests to the MQTT broker, which then talks to all subscribers (e.g. the smartphone apps that finally relay all that to the things via BLE).

While I am visiting ThingsCon in Berlin, I hope we’ll have some progress on the web UIs and hardware side to share soon. I expect new hardware next week, as well as an updated web UI. A few more iterations with our awesome friends at SNK and DerGrueneFisch. And we should be ready to show the final version. Enjoy.

Moto Update: the smartphone is now our MQTT/BLE Gateway

It’s time for an update on ‘moto’ – sorry this did not happen earlier, but I’ve been busy with events like #cebit, #iotcon and #internetworld. We’ve now finalized the hardware design, and our good friends at DerGrueneFisch are manufacturing a small series of moto prototypes (9, to be exact) in the coming weeks. This also means that I’ve moved on to software-related challenges instead of hardware challenges.

Moto Architecture Diagram

If you remember the architecture diagram (find it again above), we connect the motos wirelessly via BLE. While I had been using some quick & dirty node.js-based scripts on my Mac for testing the communication over BLE, I’ve now written an Android app that acts as an MQTT/BLE gateway. Powered up, it launches two services: the BLEService and the MQTTService. These services are started and then continue to run in the background, loosely coupled via Android Intents. Right now, we fire up the services when the Android Activity (that’s what is shown on the screen when an Android app starts) becomes visible, and we stop them again once the app becomes invisible. This is really convenient, as we tear down and fire up the services a lot during testing.

BLEService
This sticky service (meaning the system may restart it if it was removed due to resource constraints) scans for new, not-yet-connected motos and then tries to connect. Once connected, we subscribe to notifications for one BLE characteristic, which acts as the event stream from the hardware. We also save a reference to another identified characteristic that we use to send our commands to. In order to react to commands and be able to forward events, we use Android Intents. The BLEService registers listeners for all Intents that the MQTTService sends out, as they contain the moto commands that need to be forwarded to the motos. The BLEService also maps the incoming commands to the corresponding motos and – new – is now namespaced. That means the users of the moto Android app will later be able to choose their namespace, so the analytics data is kept separate from others’.

MQTTService
For MQTT, we’re using the only Android/Java MQTT client we were able to find: Paho. Although there seems to be an existing Android Service wrapper around the Paho MQTT client, it is poorly documented, and it really was simpler to create our own service that does exactly what we want it to do. The MQTTService is again sticky and should be running all the time. It tries to keep a constant connection to the MQTT broker that we host on Amazon EC2. It is subscribed to all commands that fall into its namespace, e.g. moto/<namespace>/+/command – an MQTT topic with a + wildcard, meaning it will receive messages sent to moto/<namespace>/1/command, for example.
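To illustrate how the + wildcard behaves (a simplified sketch for illustration only – not the matching logic inside Paho, which also handles the multi-level # wildcard and several edge cases):

```cpp
#include <sstream>
#include <string>
#include <vector>

// Split an MQTT topic into its '/'-separated levels.
static std::vector<std::string> splitTopic(const std::string& topic) {
    std::vector<std::string> levels;
    std::stringstream ss(topic);
    std::string level;
    while (std::getline(ss, level, '/')) levels.push_back(level);
    return levels;
}

// True if 'topic' matches 'filter', where '+' matches exactly one level.
bool topicMatches(const std::string& filter, const std::string& topic) {
    std::vector<std::string> f = splitTopic(filter), t = splitTopic(topic);
    if (f.size() != t.size()) return false;   // '+' spans one level only
    for (size_t i = 0; i < f.size(); ++i)
        if (f[i] != "+" && f[i] != t[i]) return false;
    return true;
}
```

So moto/storeX/+/command receives the commands for every moto in storeX, but not for other namespaces and not for deeper topic paths.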

Getting MQTT or BLE running on Android alone and for a small demo is pretty easy. The complexity comes once you try to connect to multiple devices at once, because the Android BLE APIs are synchronous, and firing too many BLE requests at once will simply override some of them. So one has to work with a few delays and timers here and there to make sure everything works reliably. The idea is also that sales agents with the app installed can roam freely, and whoever is close to the BLE devices will have their phone/app connect transparently. So far, this works really nicely. After a few seconds outside the coverage area, the BLEService starts to receive disconnect callbacks and we remove the moto element from the list of connected ones. This enables it to be picked up by another sales agent whose device has the app installed.

The Protocol
At least for now, I’ve also frozen the “protocol”, i.e. which characteristics are used, what data is sent, and how it is determined what is possible and what is not. First of all, for sending and receiving data from/to the moto elements, I use two separate BLE characteristics. This simply keeps everything a bit more organized and easier to understand. For sending from the BLE hardware to the smartphone, struct-based events like these are used (this is straight from the Arduino IDE):
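The original embed is missing here, so below is a hedged reconstruction of what such structs look like – the field names and sizes are assumptions based on the description, not the original firmware code. What matters is the pattern: packed structs whose first byte identifies the event type.

```cpp
#include <cstdint>

enum EventType : uint8_t { META_DATA = 1, PRESENCE_DATA = 2 };

// Packed, so the byte layout on the wire matches the struct layout.
struct __attribute__((packed)) MetaDataEvent {
    uint8_t  eventType;          // = META_DATA, always the first byte
    uint8_t  deviceId;           // which moto element sent this
    uint16_t batteryMillivolts;  // rough health info for the heartbeat
};

struct __attribute__((packed)) PresenceDataEvent {
    uint8_t  eventType;       // = PRESENCE_DATA
    uint8_t  present;         // 1 = customer in front, 0 = customer lost
    uint32_t durationMillis;  // presence duration, computed on the device
};

static_assert(sizeof(MetaDataEvent) == 4, "no padding expected");
static_assert(sizeof(PresenceDataEvent) == 6, "no padding expected");
```

The receiving side can read the first byte of the notification payload and then reinterpret the rest accordingly.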

Mainly due to issues with setting up multiple BLE notifications from Android at once, I decided to distinguish the two events that I send out via the first byte – see the “eventType” byte, which is different for a PresenceData event and a MetaData event. MetaData events are sent out regularly to inform the smartphone (and later the server) that a device is alive. We visualize the MetaData events again via heartbeats: you can tell within 10 seconds if a device is connected or not. The PresenceData events are sent whenever the presence state (customer in front/customer lost) changes. Just like with tiles, we also calculate the duration of the presence directly on the device.

For incoming data, so-called moto commands, the protocol is slightly more complex. We distinguish between two broad categories of commands:

  • “standard” commands can change the current RGB colors and the motor state (this includes on/off, direction and speed level of the motor)
  • “special” commands are distinguished from normal commands by the value of the first byte. To be able to extend the command mechanism, they introduce a “subcommand” byte as the second byte. From the third byte on, the special command’s data is sent. Right now I’ve specified a “blink” command that will blink the RGB pixels for a certain duration in a certain color. Another command implemented is a rainbow chase: the pixels update according to a color wheel, which looks like a rainbow in the end.

Some example code showing how I deal with incoming commands is below:
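The original embed is gone, so here is a hedged sketch of the dispatch logic. The concrete byte values (the 0xFF marker and the subcommand ids) are assumptions; only the standard/special split and the subcommand byte come from the description above.

```cpp
#include <cstddef>
#include <cstdint>
#include <string>

const uint8_t SPECIAL_MARKER = 0xFF;  // assumed first-byte value for special commands
const uint8_t SUB_BLINK      = 0x01;  // assumed subcommand ids
const uint8_t SUB_RAINBOW    = 0x02;

// Returns a short description of what the command would trigger; on the
// device the same branches drive the NeoPixels and the stepper motor instead.
std::string handleCommand(const uint8_t* data, size_t len) {
    if (len == 0) return "ignored";
    if (data[0] != SPECIAL_MARKER) {
        // Standard command: RGB color plus motor state
        // (on/off, direction, speed level).
        return "standard";
    }
    if (len < 2) return "ignored";   // special command needs a subcommand byte
    switch (data[1]) {               // the payload starts at data[2]
        case SUB_BLINK:   return "blink";
        case SUB_RAINBOW: return "rainbow";
        default:          return "unknown";
    }
}
```

New special commands only require a new case in the switch, which is exactly the extensibility the subcommand byte buys us.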

Android UI Adapter
One last element that got a lot of love from me is a special UI adapter for the Android app. There’s nothing super special about this data/UI binding, it is just a lot of work. The UI will later try to come close to the action that the moto element is performing: if it blinks, the UI element in the Android app will blink, colors will be reflected as well as possible, and of course the spinning status will be represented. Once I have a few motos connected at once, I will shoot a few pics and show this to you in an update.

Up next
Now that I have the hardware specced out and a running gateway prototype via the Android App, the next thing that I’ll spend time on is the server side that collects all the data. This will also include a RESTful API to control each moto element, client/server communication via socket.io for the UI and early ideas for the skinning. I hope to receive the first elements of the produced series within 2-3 weeks and will try to update you on the progress made.

 

Moto: exploring the smartphone as an IoT hub for retail

At #hybrislabs, we’ve explored IoT quite a bit now. We’ve begun with the smart wine shelf, our first IoT experience for the retail spaces that used a unique idea and combination of technologies to provide both customer and retailer value. Next up was funky retail, where we focused on the analytics in the retail space with both distance and pressure sensors. With tiles, we went wireless for the first time – but still used a central hub where all Bluetooth LE messages are collected and forwarded to the cloud.

Finally, with moto, we’re now filling a gap. We would like to explore one missing IoT topology in our portfolio: using the smartphone as a hub for the connected devices around you. Below is a pic of how the current prototype looks. In the end, it will be a glass-protected, spinning disk that is lit up from below. It will feature an IR distance sensor to detect customers and be able to change rotation speed and direction as well as the color. It will require a power cable, but communication will again be Bluetooth Low Energy. Here’s also a video of moto from a recent G+ post.

IMG_20150303_163718

What is more important, and sadly almost invisible, is *how* we connect these IoT elements. We’ll not use a central hub. Instead, the plan is to have iOS/Android apps installed on the sales assistants’ phones that automatically connect to the retailer’s smart objects. These apps connect via BLE and forward the data to/from the cloud, to/from the things. The idea is that a sales assistant can freely move in the retail space. The app will scan and connect, might lose the connection from time to time and leave one “moto” disconnected, then later move back in range and reconnect. If another sales assistant with the same app and configuration moves in range, he/she will take over. Here’s the architecture:

Moto Architecture Diagram

At this time, we’ve successfully connected to the motos and defined the rough BLE-based protocol that we’ll use. We’ve got some node.js-based code that works on a Mac for experimenting and testing. Next up is the task of writing a good Android app (iOS welcome, too) that launches, finds IoT elements, connects and then proxies the communication to the cloud. For the cloud communication, we’ll again use MQTT, but we still need to find a good and easy MQTT solution for Android/iOS. So if you have any good ideas and are able to point us in the right direction, let us know! (@hansamann or comment – we actually do read them!)

To wrap this up, here’s the raw PCB of moto with the NeoPixel RGB ring and IR distance sensor connected. The board again uses a LightBlue Bean for the BLE connectivity. As it runs on 9V for the stepper motor (not shown here), we need to step down the voltage twice: once to 5V for the NeoPixel RGB LEDs, and once to 3.3V for the LightBlue Bean. We’re also using a stepper motor driver, a DRV8834 on a breakout board, that allows us to control the direction and speed of the stepper motor.

IMG_20150303_161139

The Physical Web, Connected Retail and IoT. Some thoughts.

The hybris Summit is just over, and the hybris labs team presented many IoT-related prototypes to the customers and partners visiting. If you are following this blog, that’s no news 🙂 Today, #google was kind enough to send me a few “physical web” beacons and also two extra Intel Edison boards for any fancier ideas I might have. After some wine and wild thinking, here are some thoughts.

The Physical Web
If you’ve never heard of it: it’s basically an Apple iBeacon, but instead of a crazy, cryptic UUID (essentially just a long number), it broadcasts a URL to a website. The key thing here is to understand that an iBeacon only makes sense with a special app that scans and *interprets* the UUID. This could be Estimote’s SDK telling your app that beacon 124123412341324 should, right now, actually mean “show a coupon for the white sneakers in the showroom”. A while ago, we stopped believing that every customer would have the retailer’s app installed to enable commerce-centric use cases with iBeacons. But scanning QR codes for URLs, tapping NFC tags, or even typing URLs directly… really? How backwards 🙂

If only every thing would publish a URL
So now the physical web tries to solve that problem. There will not be an app for everything – in the end, native apps won’t scale. They might be prettier, and for some time looked like the only way to do mobile, but that approach just does not scale to the Internet of Things, where we talk about billions of smart devices. Broken down to commerce: compared to the complete IoT, there are not that many unique retailers around the globe. Still, it is unrealistic that every customer who walks into a retail space has the suitable app installed to unlock the next smart wine shelf. The physical web replaces the cryptic data sent via Apple iBeacon with URLs. The only problem: BLE advertisements are small, so some compression, similar to the NFC NDEF URL records, is required. Combined with link shorteners, which are great for built-in analytics anyway, that seems like a solvable problem.
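The compression trick is much like the NDEF one: well-known URL prefixes collapse to a single byte. A minimal encoder sketch in that spirit (only the prefix table is shown; the actual beacon specs also encode common suffixes like “.com/” as single bytes):

```cpp
#include <cstdint>
#include <string>
#include <utility>
#include <vector>

// Encode a URL for a beacon advertisement: a one-byte prefix code,
// followed by the remaining characters verbatim.
std::vector<uint8_t> encodeUrl(const std::string& url) {
    // Longer prefixes first, so "http://www." wins over "http://".
    const std::pair<const char*, uint8_t> prefixes[] = {
        {"http://www.",  0x00},
        {"https://www.", 0x01},
        {"http://",      0x02},
        {"https://",     0x03},
    };
    std::vector<uint8_t> out;
    std::string rest = url;
    for (const auto& p : prefixes) {
        const std::string pre = p.first;
        if (url.compare(0, pre.size(), pre) == 0) {
            out.push_back(p.second);      // one byte instead of 7-12 characters
            rest = url.substr(pre.size());
            break;
        }
    }
    for (char c : rest) out.push_back(static_cast<uint8_t>(c));
    return out;
}
```

Every byte saved this way is a byte more of the actual URL that fits into the tiny advertisement payload, which is why link shorteners help so much here.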

Damn, the physical web needs an app 🙂
Dammit, did I just say it’s unrealistic that our customers will each have a dedicated app installed for every single store and the “things” therein? Right now, the physical web needs an app that scans and interprets the physical web beacons. The promise is: there will be one app. Ideally, at some point, it will be integrated into the operating system – like your browser, which would be the natural place for such a web-scanning feature.

So where will our physical web beacons go?
I’ll touch the Intel Edison “dynamic physical web beacons” over the next days, but first I will attach the 10 web beacons to some objects around the office. We have a few prototypes in the hybris labs space, and each will get one. Like in a museum, each beacon will forward to a unique blog post giving you some context and additional information about the prototype. I wish our fridge, filled with beer, had a beacon so we could track the takeout of each beer and usage per employee. Oh, and one beacon should link to the Swarm (that was Foursquare, remember?) URL for our office, so people can check in easily. Maybe I should carry a web beacon, so whoever is close to me can scan the link to my G+ profile or Twitter account and follow me. Next time I give a presentation, I will first update a beacon with the URL to my #prezi presentation and distribute the share links like that.

IMG_20150216_210539

For the Intel Edison-based beacons, I need a constantly updating source, so a dynamic beacon makes sense. The latest blog post on the hybris labs blog might make sense at first sight. But after a few extra sips of wine: a simple HTTP redirect – aka the web – solves that issue. The lab.hybris.com RSS feed will already redirect you to the latest blog post, so why waste an expensive Intel Edison on this? Reporting a sensor value makes way more sense. If you want to load a sensor reading directly off the web, your sensor needs to share it with the web. Using a smart web beacon, I can send the browser to a local web address, then read the value. My local web address might be a retailer’s analytics system, with beautiful links to all the sensor data in my store. I’ll do that tomorrow or so… please send us some comments or tweet me directly!

Tiles in the Labs Space, fresh UI

Just before the holidays, we had a prominent visitor to the labs space: Bernd Leukert. For this, we decided to put the current Tiles prototype into the labs space. We’ll most likely switch the bottles (products) that we use, but I wanted to share this quickly with you.

IMG_20141217_131229 IMG_20141217_131223

 

Our friends at SNK also did a great job on the web UI. It is fully responsive, so it looks great when you add a tile with your smartphone, but you can also get a great overview of all your tiles in a desktop browser.

Screen Shot 2015-01-09 at 10.17.15 AM

 

For the hybris Summit ’15 in mid-February, we’re now working on our own product designs. We currently envision creating our own can designs and buying 7 sets of refreshing, caffeinated drinks, each with a different design (barcode). I’ll post some graphics once we have them.

Also check out our Tiles video!