General Update & Final Architecture Diagram for Expose

It’s been a while since I wrote about expose, but I am finally sitting at the Munich airport again, which is my favorite time to write blog posts. From a technical point of view, expose is in the final phase of being polished. We’ve worked with the designers at SNK to create great user interfaces, ironed out a few bugs here and there, and are currently considering two showrooms (Munich and New York) in which to install this prototype. While these discussions and the details will need a few more weeks, I think this prototype is technically locked down and done. So it’s time to take a final look at it and wrap it all up.

So again, what is it all about? 

There’s two perspectives that we can take. The technical and the business perspective.

From a technical point of view, expose is a Hybris Labs experiment that combines RFID and IoT technology with YaaS. The location subsystem constantly scans for registered RFID labels and tries to determine the best possible location. The data is analyzed and yields an individual journey per user of the system as well as overall location analytics. The action subsystem allows individual participants of this experiment to interact with the system on a 1:1 basis. Current action stations include the signup station, the bar station, the journey station and the party booth – all four offer personalized interactions based on the customer’s previous behavior.

Business-wise, such a system can for example be used at events and showrooms. Via the technical means described above, we can track the location of RFID labels, which could be attached to a visitor’s event badge. From an event coordinator’s perspective, real-time analytics on where people are, where people are likely to go and what they do (action subsystem) can be offered. While the backend users of such a system gain insights into their event and the flow of visitors, there’s something in it for the visitors who carry the RFID labels, too. They can interact at various action points on a one-to-one basis. This means the barkeeper will remember your name and favorite drink, the event host might be able to recommend people to meet based on your interests or location history, etc.

As I am a technical guy, let’s concentrate on the technical architecture diagram first – have a look:

expose :: technical architecture

From bottom to top, you can see these layers of the expose system:

  • The very basis of the system is the set of micro-services powered by YaaS – Hybris as a Service. This prototype uses quite a few of them: after registering, a user gets a new customer account via the customer service, and the products at the bar are of course maintained via the product service. A purchase / order results in a cart being checked out for the customer.
  • Once again, we’ve extended YaaS with custom micro-services for expose. Our RFID readers send HTTP POST requests at regular 3-second intervals, and the endpoint for those is part of the expose service – part of the expose package. To be brutally honest with you, at this point the configuration is rather static within this service, but at a later stage we could manage it on a tenant-by-tenant basis via a builder module. Completely accurate in the diagram above, though, are the user interfaces, which are rendered by the expose service.
  • We’re now touching the physical world, with RFID readers installed at various locations of a showroom and at the action stations where users can interact on a 1:1 basis. Our default setup will use 5 locations (Impinj Speedway Connect readers) and 4 action points. The latter are Raspberry Pis which my colleague Lars Gregori extended with a custom shield. We attach a small antenna to them so users can put their RFID label on top to have it read. The location readers constantly send the scanned RFID labels to our service, where we process the location information with a self-made algorithm and store the data in the YaaS document storage.
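The actual location algorithm isn’t published, so here is a minimal sketch of how a simple version of it could work, under the assumption that each reader reports a signal strength (RSSI) per scanned label: within one 3-second window, a label is placed at the reader that heard it with the strongest average signal. All names and the data shape are assumptions, not the real expose code.

```javascript
// Hypothetical sketch: pick the best location per RFID label from one
// window of reads. reads: [{ label, readerId, rssi }, ...]
function bestLocations(reads) {
  const stats = {}; // label -> readerId -> { sum, count }
  for (const { label, readerId, rssi } of reads) {
    const perLabel = (stats[label] = stats[label] || {});
    const s = (perLabel[readerId] = perLabel[readerId] || { sum: 0, count: 0 });
    s.sum += rssi;
    s.count += 1;
  }
  const result = {};
  for (const label of Object.keys(stats)) {
    let best = null;
    for (const readerId of Object.keys(stats[label])) {
      const { sum, count } = stats[label][readerId];
      const avg = sum / count;
      if (!best || avg > best.avg) best = { readerId, avg };
    }
    result[label] = best.readerId; // strongest average signal wins
  }
  return result;
}

// Example: label "A" is seen by two readers; the bar reader hears it louder.
const windowReads = [
  { label: 'A', readerId: 'entrance', rssi: -70 },
  { label: 'A', readerId: 'bar', rssi: -55 },
  { label: 'A', readerId: 'bar', rssi: -57 },
  { label: 'B', readerId: 'entrance', rssi: -60 },
];
console.log(bestLocations(windowReads)); // → { A: 'bar', B: 'entrance' }
```

The per-window result would then be appended to the label’s journey in the document storage; smoothing over several windows would reduce flapping between adjacent readers.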

 

The map and dashboard

 

 

 

The 4 action point UIs – signup, bar, journey and party booth.

Signup

   

Bar

 

Journey

 

 

An extra paragraph on the party booth

The party booth will be an awesome action point of the expose system, and it’s a bit crazy, I agree. I have to think of a nice way of showing you the UIs, so give me a few days after my current trip to get that done. It shows how we can interact with visitors on a 1:1 basis with the help of YaaS. It will load the data that a visitor left at signup and create a personalized party experience. At the moment, we’ve specified roughly what the party booth will look like, and a local artist, Andreas Kraeftner from Munich, is working on the physical fabrication. We use metal, glass and wood, and it all will be combined with electronics like Raspberry Pis, LEDs, loudspeakers, a disco ball and a fog machine. The booth will take pictures via a camera connected to the Raspberry Pi within the booth and will create an animated GIF in the end that users can post on Twitter.

So yes, it’s crazy. And different to many showcases you’ve seen before. It’s okay to be different!

 

An update on expose: now adding a party booth

Finally, an update on the latest developments around expose, our location / action tracking prototype that we are developing on top of YaaS. You might remember that we track the location of RFID labels via the location readers. Besides locating the labels, we have also developed an “action reader” subsystem that is used to engage with the user of the RFID label on a 1:1 basis. For the action readers, the user has to actively place his RFID label close to a small matchbox-sized antenna to be scanned. Below is the updated system architecture:

Expose Technical Architecture

While the architecture / framework for all action readers is the same (they send their scanned labels to a common backend API), we select the correct screen to show on the tablets based on the specific MQTT topics that are used. The action readers post to the backend including the tenant/reader ID information, and the backend forwards the data to the appropriate screen, connected via Socket.IO.
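The routing described above can be sketched like this. Everything here is an assumption for illustration – the topic scheme, the function names and the in-memory dispatcher (which stands in for the MQTT broker plus the Socket.IO hop) are not the actual expose code:

```javascript
// Hypothetical topic scheme: one MQTT topic per tenant and action reader.
function topicFor(tenant, readerId) {
  return `${tenant}/action/${readerId}`; // e.g. "demo/action/bar"
}

// Minimal in-memory dispatcher standing in for MQTT broker + Socket.IO.
// Each "tablet screen" subscribes to its topic; a scan posted by a reader
// is forwarded to exactly that screen.
function createDispatcher() {
  const screens = {}; // topic -> handler (the tablet screen)
  return {
    subscribe(topic, handler) { screens[topic] = handler; },
    handleScan({ tenant, readerId, label }) {
      const topic = topicFor(tenant, readerId);
      if (screens[topic]) screens[topic]({ label, topic });
    },
  };
}

// Usage: the bar screen subscribes, then a scan at the bar reader reaches it.
const dispatcher = createDispatcher();
dispatcher.subscribe('demo/action/bar', (msg) => console.log('bar screen got', msg.label));
dispatcher.handleScan({ tenant: 'demo', readerId: 'bar', label: 'E200-1234' });
```

In the real system the dispatcher would be replaced by publishing to the broker and emitting over a Socket.IO connection, but the tenant/reader-to-topic mapping is the part that decides which screen reacts.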

Right now we have completed these action reader setups:

  • signup: a kiosk where new users with fresh RFID labels are onboarded or may change their data
  • bar: a kiosk where an employee or a barkeeper can log a drink that he takes out of the fridge.
  • party: a party booth that allows you to have a personalized party based on the data that we know about the user.

For this post, I wanted to specifically pick the party action reader system. It consists of:

  • an action reader, which is tied to
  • the tablet screen for the party booth, and
  • the party booth itself

The action reader system looks like this:

Expose Action Reader – Technical

The real fun comes in when you look at the party booth. It’s a pretty nice system with a Raspberry Pi at its core.

Expose Action Station – Party Booth Technical

 

Right now, the booth looks rough 🙂 But we’re in discussions with a local artist to build a proper booth enclosure around this. It’s already a lot of fun to use, believe me!


The software of this system runs on node.js, starts automatically on boot and has been quite stable so far. The sequence for using the booth is this:

  • a new user comes into the booth and holds his RFID label close to the action scanner
  • the tablet screen (here: our TV screen) shows a welcome message and the color and music choice of the customer. This data was left by the user during the onboarding/signup process
  • the party booth Raspberry Pi starts playing back the music according to the profile
  • the DotStar LEDs are colored according to the profile – in combination with the rotating disco ball, this creates a nice atmosphere in the booth later
  • the fog machine turns on for a few seconds, so the bottom of the box fills with fog
  • while the user is in the booth and the music is playing, pictures are taken via the Raspberry Pi camera. These pics appear in real time on the tablet screen
  • once the party is over, all pics are aggregated into an animated GIF and again shared to the tablet screen
  • the user can now select one image and it will be shared to the ylabsparty Twitter account. Have a look – it’s already pretty cool!
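The steps above can be sketched as a small orchestration, with all the hardware mocked out. This is a hypothetical sketch, not the booth’s actual code: the function names and the mock hardware object are invented for illustration, and each step just records what it would do, so only the ordering of the party flow is shown.

```javascript
// Hypothetical party-booth sequence with mocked hardware.
// hw provides: playMusic, setLeds, fog, takePicture, makeGif.
function runParty(profile, hw) {
  const log = [];
  log.push(hw.playMusic(profile.music));             // 1. music from the signup profile
  log.push(hw.setLeds(profile.color));               // 2. LEDs in the visitor's chosen color
  log.push(hw.fog(5));                               // 3. a few seconds of fog
  const pics = [hw.takePicture(), hw.takePicture()]; // 4. pictures during the party
  log.push(hw.makeGif(pics));                        // 5. aggregate into an animated GIF
  return log;
}

// Mock hardware so the sequence can run anywhere (no Pi required).
const mockHw = {
  playMusic: (track) => `playing ${track}`,
  setLeds: (color) => `leds ${color}`,
  fog: (secs) => `fog for ${secs}s`,
  takePicture: () => 'pic',
  makeGif: (pics) => `gif of ${pics.length} pics`,
};

console.log(runParty({ music: 'disco', color: 'blue' }, mockHw));
// → [ 'playing disco', 'leds blue', 'fog for 5s', 'gif of 2 pics' ]
```

On the real booth, each mock would be replaced by the actual driver call (audio playback, DotStar LED control, a relay for the fog machine, the Pi camera), while the sequence itself stays the same.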

All right – good for now, ready for the weekend. I hope to update you again soon; till then, follow us via the ylabsparty Twitter account!