General Update & Final Architecture Diagram for Expose

It’s been a while since I wrote about expose, but I am finally sitting at the Munich airport again, which is my favorite place to write blog posts. From a technical point of view, expose is in its final polishing phase. We’ve worked with the designers at SNK to create great user interfaces, ironed out a few bugs here and there, and are currently talking to two showrooms (Munich and New York) about installing this prototype. While these discussions and the details will need a few more weeks, technically this prototype is locked down and done. So it’s time to take a final look at it and wrap it all up.

Bullseye partially open-sourced – have a look!

Since we introduced Bullseye, a Hybris-as-a-Service (YaaS) based prototype for in-store customer engagement & commerce, for the first time at the Hybris Summit ’16 in Munich, we’ve been showing and replicating it across the globe like crazy. We’ve even had companies like BASF run public trials in their stores, and just as I write these sentences, we’ve signed up showrooms in Singapore and Thailand. It’s a truly global prototype, highly flexible in its configuration and running on our beloved YaaS infrastructure in the cloud.

While the software parts of this prototype (below is an architecture diagram to help you remember) are easy to scale, we’ve had quite a few challenges scaling the hardware. Our platforms – containing a small microcontroller, a light sensor and an LED ring – are hand-made and hand-soldered, each with a 3D-printed case that alone takes about 4 hours to print at decent quality. We’ve created many of these platforms ourselves, spending days and weeks making new platforms for new prototype installations somewhere on this globe.

While we’ve been successful in finding a local electronics engineering company that has already produced these platforms for several projects, the platforms still needed to come to our desks to be flashed with the correct firmware and initialized. So far we’ve not been able to outsource these parts, as there’s software involved that we could not easily hand over to them.

That’s changed now! We’ve successfully open-sourced all the hardware-facing parts of our Bullseye prototype: take a look at the plat GitHub page! This will greatly facilitate the production of platforms in the future, as the hardware & software of the platforms is now completely available to others. It would also be cool to see variations – we’ve used a light sensor and an LED ring in our platform, but you could easily swap those for other sensors and actuators!

In the end, our new open source project is a great blueprint for connected devices. It will not fit all use cases of course, but I could well imagine that it works for a lot of the ideas people have. Here are a few things you can do/learn with this project:

  • Figure out how we reliably connect a Raspberry Pi to the cloud via MQTT and node.js upon booting the device (see the sketch after this list)
  • Figure out how to send data from the Raspberry Pi to connected/wired platforms via USB, potentially with USB hubs in between to scale the number of platforms connected
  • Figure out how to write a serial protocol to collect events from the platforms or send commands to them
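To make the first bullet concrete, here is a minimal sketch of how a node.js process on the Pi can establish its MQTT link at boot (e.g. started from a systemd unit). The broker URL, topic names and DEVICE_ID variable are placeholder assumptions, not the actual values from the plat repo:

```js
// Minimal boot-time MQTT connection sketch for a Raspberry Pi.
// Broker URL, topic names and DEVICE_ID are placeholder assumptions.
const mqtt = require('mqtt');

const client = mqtt.connect('mqtts://broker.example.com:8883', {
  clientId: `plat-${process.env.DEVICE_ID || 'dev'}`,
  reconnectPeriod: 5000, // keep retrying until the network is up after boot
});

client.on('connect', () => {
  // announce the device, then listen for commands addressed to it
  client.publish('devices/online', JSON.stringify({ id: process.env.DEVICE_ID }));
  client.subscribe(`devices/${process.env.DEVICE_ID}/commands`);
});

client.on('message', (topic, payload) => {
  console.log(`command on ${topic}: ${payload.toString()}`);
});
```

The reconnect period matters more than it looks: at boot the network is often not up yet when the process starts, so the client must keep retrying rather than exit.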

Have a look, clone the repo, try it out! After all: Have Fun!

An update on expose: now adding a party booth

Finally, an update on the latest developments around expose, our location/action tracking prototype that we are developing on top of YaaS. You might remember that we track the location of RFID labels via the location readers. Besides locating the labels, we have also developed an “action reader” subsystem that is used to engage with the user of an RFID label on a 1:1 basis. For the action readers, the user has to actively place his RFID label close to a small matchbox-sized antenna to be scanned. Below is the updated system architecture:

[Diagram: Expose Technical Architecture]

While the architecture/framework for all action readers is the same (they send their scanned labels to a common backend API), the specific MQTT topics used determine which screen is shown on the tablets. The action readers post to the backend with tenant/reader ID information, and the backend forwards the data to the appropriate screen, which is connected via Socket.IO. A rough sketch of this fan-out follows.
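Here is a hedged sketch of that fan-out, using Express and Socket.IO. The route, room and event names are illustrative assumptions, not the actual expose API:

```js
// Sketch: action readers POST scans, the backend forwards each scan to
// whichever tablet screen registered for that tenant/reader pair.
// Route, room and event names are illustrative assumptions.
const express = require('express');
const http = require('http');

const app = express();
app.use(express.json());
const server = http.createServer(app);
const io = require('socket.io')(server);

// a tablet screen registers for its tenant/reader pair
io.on('connection', (socket) => {
  socket.on('register', ({ tenant, readerId }) => {
    socket.join(`${tenant}/${readerId}`);
  });
});

// an action reader reports a scanned label
app.post('/actions', (req, res) => {
  const { tenant, readerId, label } = req.body;
  io.to(`${tenant}/${readerId}`).emit('scan', { label });
  res.sendStatus(204);
});

server.listen(3000);
```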

Right now we have completed these action reader setups:

  • signup: a kiosk where new users with fresh RFID labels are onboarded or can update their data
  • bar: a kiosk where an employee or barkeeper can log a drink taken out of the fridge
  • party: a party booth that gives you a personalized party based on the data we know about the user

For this post, I wanted to specifically pick the party action reader system. It consists of:

  • an action reader, which is tied to
  • the tablet screen for the party booth, and
  • the party booth itself

The action reader system looks like this:

[Diagram: Expose Action Reader – Technical]

The real fun comes in when you look at the party booth. It’s a pretty nice system with a Raspberry Pi at its core.

[Diagram: Expose Action Station – Party Booth Technical]

Right now, the booth looks rough 🙂 But we’re in discussions with a local artist to create an enclosure around it. It’s already a lot of fun to use, believe me!


The software of this system runs on node.js, starts automatically upon boot and has been quite stable so far. The sequence for using the booth is this (a condensed code sketch follows the list):

  • a new user comes into the booth and holds his RFID label close to the action scanner
  • the tablet screen (here: our TV screen) shows a welcome message along with the color and music choice of the customer. This data was provided by the user during the onboarding/signup process
  • the party booth Raspberry Pi starts playing back music according to the profile
  • the DotStar LEDs are colored according to the profile – in combination with the rotating disco ball, this creates a nice atmosphere in the booth
  • the fog machine turns on for a few seconds, so the bottom of the box fills with fog
  • while the user is in the booth and the music is playing, pictures are taken via the Raspberry Pi camera. These pics appear in real time on the tablet screen
  • once the party is over, all pics are aggregated into an animated gif and again shared to the tablet screen
  • the user can now select one image and it will be shared to the ylabsparty twitter account – have a look, it’s already pretty cool!
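Condensed into code, the sequence might look roughly like this. The hardware helpers are stubs – the real booth drives a DotStar strip, a fog machine relay, the Pi camera and the Twitter API, which can’t be reproduced here – and the music player call is a guess:

```js
// Condensed, illustrative party sequence. setLeds/fogMachine/capturePhoto
// are stubs for the real hardware drivers; the music player call is a guess.
const { exec } = require('child_process');

const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));
const setLeds = (color) => console.log(`LEDs -> ${color}`);  // stub: DotStar strip
const fogMachine = (on) => console.log(`fog -> ${on}`);      // stub: relay for fog machine
const capturePhoto = (i) => console.log(`camera: pic ${i}`); // stub: Pi camera shot

async function party(profile) {
  setLeds(profile.color);                    // color choice from the signup profile
  exec(`mpg123 music/${profile.music}.mp3`); // assumed player and file layout
  fogMachine(true);                          // fill the bottom of the box with fog
  await sleep(5000);
  fogMachine(false);
  for (let i = 0; i < 10; i++) {             // take pictures while the music plays
    capturePhoto(i);
    await sleep(3000);
  }
  // the real system then builds an animated gif from the pics, pushes it to
  // the tablet screen via Socket.IO, and tweets the user's favorite shot
}

party({ color: 'purple', music: 'disco' });
```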

All right – good for now, ready for the weekend. I hope to update you again soon; till then, follow us via the ylabsparty twitter account!

Expose: RFID-based location tracking

Internally code-named “expose”, we’re working on a new Hybris Labs experiment that is finally big and good enough to be blogged about. While it has been maturing for a few weeks now, it’s still in its infancy and many parts will evolve over time. What is it about? It’s all about using RFID readers to track the physical location of RFID tags (associated with colleagues who opted in) at our office in Munich. The proximity to the YaaS (hYbris As A Service) teams allowed us to build the complete architecture on this platform, and I am really thrilled to give you an idea of it with this blog post.

The system is currently still small, but big enough to make technical sense. We have a setup of up to 5 RFID readers which are mapped to 5 locations (POIs, points of interest): two meeting rooms and three “areas” in the Hybris Munich office.
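Just to illustrate the shape of such a reader-to-POI mapping (MAC addresses and names below are invented, not our actual setup):

```js
// Hypothetical reader-to-POI mapping; all values are invented for illustration.
const pois = {
  '00:16:25:aa:bb:01': 'meeting room kleve',
  '00:16:25:aa:bb:02': 'meeting room 2',
  '00:16:25:aa:bb:03': 'labs area',
  '00:16:25:aa:bb:04': 'cafe (4th floor)',
  '00:16:25:aa:bb:05': 'lounge area',
};
```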


Before I go into detail, I’d like to cover the topic of privacy/security. Are we saying that we would like to equip future customers with RFID labels to track them? No, we are not. This is an experiment, yielding a general-purpose location tracking API and the means to process these events. RFID is a technology that we’ll likely use at events to generate these location events very quickly. It brings the demos to life, as we’ll be able to consume many, many events over a short time. So keep that in mind.

The big picture

Just for the purpose of this post, I created a first architecture diagram. Let me step through the architecture so you get a good understanding.

[Diagram: Expose Technical Architecture]

At the very top, we have the RFID tags (sometimes called RFID labels), which have tiny microcontrollers and larger antennas around them. Those are the items we track. Each of them has a unique ID in its memory. They are passive, meaning that they don’t have a power supply of their own – they are powered by the energy that the RFID antennas in the next layer supply to them. Up to 4 antennas can connect to an RFID reader, which is the “edge computing” element if you like. The RFID reader constantly scans for tags via the antennas and sends HTTPS POST requests to our back end services every 3 seconds. The data in these requests is pretty basic: the reader’s MAC address and essentially a list of tags plus the antenna (the port of the reader) that scanned each tag. Our RFID readers are from Impinj, a partner of SAP Hybris that we have used in the past for prototypes such as the Changing Room.

Our main REST endpoint, the expose service, is a node.js based cloudfoundry web app which is “YaaS-ified” by registering it as a YaaS service. This will later allow us to set up security via OAuth2, metering, billing, etc. Another part that is currently purely fictional (as it does not yet exist) is the expose builder. The builder component will later allow a business user to administrate the setup. We’ve designed the whole system to be tenant-aware, which makes it easier to reuse for other events and purposes. Below the custom components (the expose service and builder module) you’ll find the many core YaaS services that we are currently using: OAuth2 for getting access tokens to tenant-specific services such as documents (location history), customers (each RFID tag is associated with a customer object) and so on.
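To make the reader-to-service contract a bit more tangible, here is roughly the shape of such a report once parsed on our side. The field names are assumptions for illustration, not the reader’s actual schema:

```js
// Illustrative shape of one reader report; field names are assumptions.
const exampleReport = {
  readerMac: '00:16:25:aa:bb:01', // identifies the reader, and thus the POI
  reads: [
    { tagId: 'E2000017221101441890', antennaPort: 1 }, // tag + port that saw it
    { tagId: 'E2000017221101441907', antennaPort: 3 },
  ],
};
```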

In a nutshell, we created an RFID-based location tracking system which tries to be as technology-independent as possible. We’ve worked hard on the algorithms that determine the location of an RFID tag even though multiple readers may scan it and send their requests to the back end service a few seconds apart. Our system is based on the rules of simplicity and honesty. We acknowledge that RFID has its shortcomings when it comes to tracking locations – for example, we simply cannot scan tags that are “hidden” behind bodies. Therefore, each location that we emit carries a quality indication – sketched in code after the list below – such as being

  • “fresh, insecure, just one-time scanned”
  • “safe, constantly scanned for several seconds” and
  • “stale, no more updates for this tag for some time”
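These three states map naturally onto scan timing. A minimal sketch of such a classification, with made-up thresholds (the tuned values live in the expose service):

```js
// Classify a tag's location quality from its scan timestamps at a POI.
// The 5s/30s thresholds are invented for illustration.
function locationQuality(firstSeenMs, lastSeenMs, nowMs = Date.now()) {
  if (nowMs - lastSeenMs > 30000) return 'stale';      // no updates for some time
  if (lastSeenMs - firstSeenMs < 5000) return 'fresh'; // just one-time scanned
  return 'safe';                                       // constantly scanned for seconds
}
```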

Some Screenshots

Here are some early screenshots before beautification by our artists in residence (SNK).

[Screenshot: zones UI]

The zones UI (above) is a real-time view into the location tracking system. Updated via a socket.io connection, it shows all tags with their associated customer accounts and location state every second. Looks like I’ve been at Labs for some time (hence the “safe” location), Agnieszka was “freshly scanned” at the cafe on the 4th floor and Ulf also just walked into the kleve meeting room.
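Consuming that feed from a browser or node.js client is nearly a one-liner with socket.io-client; the URL, event name and fields below are assumptions:

```js
// Minimal consumer of the real-time zones feed; URL/event/fields are assumptions.
const { io } = require('socket.io-client');

const socket = io('https://expose.example.com');
socket.on('locations', (tags) => {
  // one entry per RFID label: customer account, location and quality state
  for (const t of tags) {
    console.log(`${t.customer}: ${t.location} (${t.quality})`);
  }
});
```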

[Screenshot: map view]

Another view option is the map view. It’s terribly ugly right now, but technically it already shows the data. We’ll change the UI soon, along with the concept behind the map – it will probably take on the character of a heat map.

[Screenshot: location analytics]

Our first analytics UI shows the safe locations over the last 24 hours. It gives you an idea of which areas are most frequented.

[Screenshot: journey UI]

Probably the roughest of the UIs (but hey, we have nothing to hide, have we?) is the journey UI. Per user, you can see the safe locations in a journey view (again just over the last 24h).

What’s next?

While we already have a few UIs, the main focus at this point has to be on the technical aspects and on making the core RFID system run really well. We need to properly test the system with our colleagues and continue to find and fix bugs. But the big features currently in our heads are:

  • support for so-called “action readers” that are tied to a location, but also to a specific action. At an event, this might mean actions performed at the reception (new user signup) or the bar (2 cappuccinos). These actions become part of the journey elements and will probably be visualized in such a UI. This will also open up interesting integrations with other YaaS core services such as the cart. As we already have customer accounts, this should be easy.
  • integration with SAP Hybris Profile to be able to do things like “customers similar to you also visited these locations”, etc.
  • once we’re stable enough: optimized UIs – kiosk-style location maps etc. for the events. Also: a builder module for easy configuration of the system per tenant.

Please help us! Carry the RFID tags, check the UIs provided to see if they make sense, and talk to us if you have questions!

And about 500 Customer Accounts later…

The #HybrisSummit is over, and it was a blast. As our bullseye prototype is based on YaaS, we’ve collected a lot of data, and I quickly wanted to share it with you. So here we go: we’ve had >500 customer accounts created & >1200 product liftups in just two days! That means >500 demos given – and very often more people were watching, of course!



If you look a bit deeper, you’ll see that the top product was the “Toucan”, our very colorful, sweet/medium style candy. The data we’ve collected from the profile questionnaires is also really interesting and will help us with upcoming orders: about 50% of all customers answered that they like sweet candy, followed by sour and salty.

Hope you like it, please reshare & comment as you like!