General Update & Final Architecture Diagram for Expose

It’s been a while since I wrote about expose, but I am finally sitting at the Munich airport again – my favorite place to write blog posts. From a technical point of view, expose is in the final phase of being polished. We’ve worked with the designers at SNK to create great user interfaces, ironed out a few bugs here and there, and are currently thinking of two showrooms (Munich and New York) in which to install this prototype. While these discussions and the details will need a few more weeks, I think technically this prototype is locked down and done. So it’s time to take a final look at it and wrap it all up.

So again, what is it all about? 

There are two perspectives we can take: the technical and the business perspective.

From a technical point of view, expose is a Hybris Labs experiment that combines RFID and IoT technology with YaaS. The location subsystem constantly scans for registered RFID labels and tries to determine the best possible location. The data is analyzed and yields an individual journey per user of the system as well as overall location analytics. The action subsystem allows individual participants of this experiment to interact on a 1:1 basis with the system. Current action stations include the signup station, the bar station, the journey station and the party booth – all four offer personalized interactions based on the customer’s previous behavior.

Business-wise, such a system can for example be used at events and showrooms. Via the technical means described above, we can track the location of RFID labels, which could be attached to a visitor’s event badge. From an event coordinator’s perspective, real-time analytics about where people are, where they are likely to go and what they do (action subsystem) can be offered. While the backend users of such a system gain insights into their event and the flow of visitors, there’s something in it for the visitors that carry the RFID labels, too. They can interact at various action points on a one-to-one basis. This means the barkeeper will remember your name and favorite drink, the event host might be able to recommend people to meet based on your interests or location history, etc.

As I am a technical guy, let’s concentrate on the technical architecture diagram first – have a look:

expose :: technical architecture

From bottom to top, you can see these layers of the expose system:

  • The very basis of the system is the set of micro-services powered by YaaS – Hybris as a Service. This prototype uses quite a lot of them: after registering a user, we get a new customer account via the customer service, and the products at the bar are of course maintained via the product service. A purchase/order results in a cart being checked out for the customer.
  • Once again, we’ve extended YaaS with custom micro-services for expose. Our RFID readers send HTTP POST requests in regular 3s intervals, and the endpoints for those are part of the expose service – part of the expose package (a minimal sketch of such an endpoint follows this list). To be brutally honest with you, at this point the configuration is rather static within this service, but at a later stage we could manage it on a tenant-by-tenant basis via a builder module. The user interfaces in the diagram above are accurate, though – they are rendered by the expose service.
  • We’re now touching the physical world, with RFID readers installed at various locations of a showroom and at the action stations where users can interact on a 1:1 basis. Our default setup will use 5 locations (Impinj Speedway Connect readers) and 4 action points. The latter are Raspberry Pis which my colleague Lars Gregori extended with a custom shield. We attach a small antenna to them so users can put their RFID label on top to have it read. The location readers constantly send the scanned RFID labels to our service, where we process the location information with a self-made algorithm and store the data in the YaaS document storage.
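For illustration, here’s a minimal node.js/Express sketch of what such a reader endpoint could look like – the route, payload shape and helper names are my assumptions, not the actual expose implementation:

    const express = require('express');
    const app = express();
    app.use(express.json());

    // Readers POST scan batches roughly every 3 seconds.
    app.post('/readers/:readerId/scans', (req, res) => {
      const scans = req.body.scans || [];
      scans.forEach((scan) => {
        // scan.epc = the tag ID, scan.antennaPort = which antenna saw it (assumed fields)
        recordScan(req.params.readerId, scan.epc, scan.antennaPort, Date.now());
      });
      res.sendStatus(204);
    });

    function recordScan(readerId, epc, antennaPort, timestamp) {
      // The real service feeds this into the location algorithm and the
      // YaaS document storage; logging stands in for that here.
      console.log(`reader=${readerId} tag=${epc} antenna=${antennaPort} @ ${timestamp}`);
    }

    app.listen(3000);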

 

The map and dashboard

The 4 action point UIs – signup, bar, journey and party booth:

Signup

Bar

Journey

An extra paragraph on the party booth

The party booth will be an awesome action point of the expose system – and it’s a bit crazy, I agree. I still have to think of a nice way of showing you the UIs, so give me a few days after my current trip to get that done. It shows how we can interact with visitors on a 1:1 basis with the help of YaaS: it loads the data that a visitor left at signup and creates a personalized party experience. At the moment, we’ve specified roughly how the party booth will look, and a local artist, Andreas Kraeftner from Munich, is working on the physical fabrication. We use metal, glass and wood, all combined with electronics like Raspberry Pis, LEDs, loudspeakers, a disco ball and a fog machine. The booth will take pictures via a camera connected to the Raspberry Pi inside and will create an animated GIF in the end that users can post on Twitter.
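To give you an idea of the GIF step, here’s one possible way to do it on the Raspberry Pi – a sketch assuming ImageMagick’s convert is installed and the snapshots live in a pics/ folder; this is not necessarily how the booth will do it:

    const { exec } = require('child_process');

    function makePartyGif(done) {
      // -delay is in 1/100s between frames; -loop 0 repeats forever.
      exec('convert -delay 40 -loop 0 pics/*.jpg party.gif', (err) => {
        done(err, err ? null : 'party.gif');
      });
    }

    makePartyGif((err, gif) => {
      if (err) return console.error('GIF creation failed:', err.message);
      console.log('created', gif); // ready to be posted to Twitter
    });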

So yes, it’s crazy. And different to many showcases you’ve seen before. It’s okay to be different!

 

Bullseye partially open-sourced – have a look!

Since we introduced Bullseye, a Hybris-as-a-Service (YaaS) based prototype around in-store customer engagement & commerce, for the first time at the Hybris Summit ’16 in Munich, we’ve been showing and replicating it across the globe like crazy. We’ve even had companies like BASF do public trials in their stores, and just as I write these sentences, we’ve signed up showrooms in Singapore and Thailand. It’s a truly global prototype, highly flexible in terms of configuration and running on our beloved YaaS infrastructure in the cloud.

While the software parts of this prototype (below is an architecture diagram to help you remember) are easy to scale, we’ve had quite a few challenges scaling the hardware. Our platforms – containing a small microcontroller, a light sensor and an LED ring – are hand-made, hand-soldered, each with a 3D-printed case which alone takes about 4 hours to print in decent quality. We’ve created many of these platforms ourselves, spending days and weeks making new platforms for new prototype installations somewhere on this globe.

While we’ve been successful in finding a local electronics engineering company that has produced these platforms for several projects already, the platforms still needed to come to our desks to be flashed with the correct firmware and initialized. So far we’ve not been able to outsource these parts, as there’s software involved that we could not easily hand over to them.

That’s changed now! We’ve successfully open-sourced all the hardware-facing parts of our Bullseye prototype: take a look at the plat GitHub page! This will greatly facilitate the production of platforms in the future, as the hardware & software of the platforms is now completely available to others. It would also be cool to see variations – we’ve used a light sensor and an LED ring in our platform, but you could easily swap those for other sensors and actuators!

In the end, our new open source project is a great blueprint for connected devices. It will not fit all use cases of course, but I could well imagine that it works for a lot of ideas that people have. Here are a few things you can do/learn with this project:

  • Figure out how we reliably connect a Raspberry Pi to the cloud via MQTT and node.js upon booting the device (a minimal sketch follows this list)
  • Figure out how to send data from the Raspberry Pi to connected/wired platforms via USB, potentially with USB hubs in between to scale the number of platforms connected
  • Figure out how to write a serial protocol to collect events from the platforms or send commands to them
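To give a taste of the first point, here’s a minimal sketch of a boot-time MQTT connection with reconnect handling – the broker URL and topic names are placeholders, the real logic lives in the plat repository:

    const os = require('os');
    const mqtt = require('mqtt');

    const client = mqtt.connect('mqtts://broker.example.com', {
      clientId: 'base-' + os.hostname(),
      reconnectPeriod: 5000, // keep retrying every 5s if the network drops
    });

    client.on('connect', () => {
      // (Re-)subscribe on every connect, so a reconnect restores all topics.
      client.subscribe(['bases/base-1', 'platforms/+']);
    });

    client.on('message', (topic, payload) => {
      console.log('command on', topic, ':', payload.toString());
    });

    client.on('error', (err) => console.error('mqtt error:', err.message));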

Have a look, clone the repo, try it out! After all: Have Fun!

 

An update on expose: now adding a party booth

Finally, an update on the latest developments around expose, our location/action tracking prototype that we’re developing on top of YaaS. You might remember that we track the location of RFID labels via the location readers. Besides locating the labels, we’ve also developed an “action reader” subsystem that is used to engage with the user of the RFID label on a 1:1 basis. For the action readers, the user has to actively place his RFID label close to a small matchbox-sized antenna to be scanned. Below is the updated system architecture:

Expose Technical Architecture (1)

While the architecture/framework for all action readers is the same (they send their scanned labels to a common backend API), we pick the correct screen to show on the tablets based on the specific MQTT topics that are used. The action readers post to the backend including the tenant/reader ID information, and the backend forwards the data to the appropriate screen, connected via Socket.IO.
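A condensed sketch of that forwarding step could look like this – route and room names are my assumptions, not the actual expose code:

    const express = require('express');
    const http = require('http');

    const app = express();
    app.use(express.json());
    const server = http.createServer(app);
    const io = require('socket.io')(server);

    // Each tablet screen registers for its tenant/reader pair and joins a room.
    io.on('connection', (socket) => {
      socket.on('register', ({ tenant, readerId }) => {
        socket.join(tenant + '/' + readerId);
      });
    });

    // Action readers POST scanned labels, including tenant and reader ID.
    app.post('/actions/:tenant/:readerId', (req, res) => {
      const room = req.params.tenant + '/' + req.params.readerId;
      io.to(room).emit('scan', req.body); // push the scan to the right screen
      res.sendStatus(202);
    });

    server.listen(3000);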

Right now we have completed these action reader setups:

  • signup: a kiosk where new users with fresh RFID labels are onboarded or may change their data
  • bar: a kiosk where either an employee or a barkeeper can log a drink that he takes out of the fridge.
  • party: a party booth that allows you to have a personalized party based on the data that we know about the user.

For this post, I wanted to specifically pick the party action reader system. It consists of:

  • an action reader, which is tied to a
  • tablet screen for the party booth, and
  • the party booth itself

The action reader system looks like this:

Expose Action Reader - Technical (1)

The real fun comes in when you look at the party booth. It’s a pretty nice system with a Raspberry Pi at its core.

Expose Action Station - Party Booth Technical (1)

 

Right now, the booth looks rough 🙂 But we’re in discussions with a local artist to build some booth enclosure around this. It’s already a lot of fun to use, believe me!


The software of this system runs on node.js, starts automatically upon boot and is so far quite stable. The sequence for using the booth is this (a rough sketch of the orchestration follows the list):

  • a new user comes into the booth and holds his RFID label close to the action scanner
  • the tablet screen (here: our TV screen) shows a welcome message and the color and music choice of the customer – this data was provided by the user during the onboarding/signup process
  • the party booth Raspberry Pi starts playing back music according to the profile
  • the DotStar LEDs are colored according to the profile – in combination with the rotating disco ball, this creates a nice atmosphere in the booth
  • the fog machine turns on for a few seconds, so the bottom of the box fills with fog
  • while the user is in the booth and the music is playing, pictures are taken via the Raspberry Pi camera; these pics appear in real-time on the tablet screen
  • once the party is over, all pics are aggregated into an animated GIF which is again shared to the tablet screen
  • the user can now select one image and it will be shared to the ylabsparty Twitter account – have a look, it’s already pretty cool!
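And here is the promised rough sketch of that orchestration in node.js – the GPIO pin, commands and timings are invented for illustration (omxplayer and raspistill are the stock Raspbian tools; the real booth drives DotStar LEDs, which I’ve abstracted away):

    const { exec } = require('child_process');
    const { Gpio } = require('onoff');

    const fogMachine = new Gpio(17, 'out'); // relay assumed on GPIO 17

    function startParty(profile) {
      // 1. play the visitor's preferred track
      exec('omxplayer music/' + profile.musicChoice + '.mp3');
      // 2. color the LEDs (abstracted here)
      setLedColor(profile.favoriteColor);
      // 3. a few seconds of fog to fill the bottom of the box
      fogMachine.writeSync(1);
      setTimeout(() => fogMachine.writeSync(0), 5000);
      // 4. take a picture every few seconds while the party runs
      const camera = setInterval(
        () => exec('raspistill -t 1 -o pics/' + Date.now() + '.jpg'), 4000);
      setTimeout(() => clearInterval(camera), 60000); // party ends after a minute
    }

    function setLedColor(color) {
      console.log('LEDs set to', color); // placeholder for the DotStar driver
    }

    startParty({ musicChoice: 'disco', favoriteColor: 'blue' });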

All right – that’s it for now, ready for the weekend. I hope to update you again soon; till then, follow us via the ylabsparty Twitter account!

Expose: RFID-based location tracking

Internally code-named “expose”, we’re working on a new Hybris Labs experiment that is finally big and good enough to be blogged about. While it has been maturing for a few weeks now, it’s still in its infancy and many parts will grow over time. What is it about? It’s all about using RFID readers to track the physical location of RFID tags (associated with opt-in colleagues) at our office in Munich. The proximity to the YaaS (hYbris As A Service) teams allowed us to create the complete architecture based on this platform, and I am really thrilled to give you an idea of it with this blog post.

The system is currently still small, but also big enough to make technical sense. We have a setup of up to 5 RFID readers which are mapped to 5 locations (POIs, points of interest): two meeting rooms and 3 “areas” in the Hybris Munich office.


Before I go into detail, I’d like to cover the topic of privacy/security. Are we saying that we would like to equip future customers with RFID labels to track them? No, we are not. This is an experiment, yielding a general-purpose location tracking API and the means to process these events. RFID is a technology that we’ll likely use at events to generate these location events in a very quick fashion. It brings the demos to life, as we’ll be able to consume many, many events over a short time. So keep that in mind.

The big picture

Just for the purpose of this post, I created a first architecture diagram. Let me step through the architecture, so you get a good understanding.

Expose Technical Architecture

At the very top, we have the RFID tags (sometimes called RFID labels), which have tiny micro-controllers and larger antennas around them. Those are the items we track. Each of them has a unique ID in its memory. They are passive, meaning they don’t have a power supply of their own – they are powered by the energy that the RFID antennas in the next layer supply to them. Up to 4 antennas can connect to an RFID reader, which is the “edge computing” element if you like. The RFID reader constantly scans for tags via the antennas and sends HTTPS POST requests to our back-end services every 3 seconds. The data that these requests include is pretty basic: the reader’s MAC address and essentially a list of tags and the antenna (the port of the reader) that scanned each tag. Our RFID readers are from Impinj, a partner of SAP Hybris that we have worked with in the past for prototypes such as the Changing Room.

Our main REST endpoint, the expose service, is a node.js-based Cloud Foundry web app which is “YaaS-ified” by registering it as a YaaS service. This will later allow us to set up security via OAuth2, metering, billing, etc. Another part that is currently purely fictional (as it does not yet exist) is the expose builder. The builder component will later allow a business user to administrate the setup. We’ve designed the whole system to be tenant-aware, which makes it easier to reuse for other events and purposes. Below the custom components (the expose service and builder module) you’ll find the many core YaaS services that we are currently using: OAuth2 for getting access tokens to tenant-specific services such as documents (location history), customers (each RFID tag is associated with a customer object) and so on.

In a nutshell, we created an RFID-based location tracking system which tries to be as technology-independent as possible. We’ve worked hard on the algorithms that try to determine the location of the RFID tags even though multiple readers scan the tags and send their requests with a few seconds’ offset to the back-end service. Our system is based on the rules of simplicity and honesty. We acknowledge that RFID has its shortcomings when it comes to tracking locations – for example, we simply cannot scan tags that are “hidden” behind bodies. Therefore, the location that we emit has a quality indication (sketched in code after the list) such as being

  • “fresh, insecure, just one-time scanned”
  • “safe, constantly scanned for several seconds” and
  • “stale, no more updates for this tag for some time”
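The sketch of that quality indication, as promised – thresholds and field names are invented for illustration, not the actual algorithm:

    const SAFE_AFTER_MS = 5000;   // scanned continuously for several seconds
    const STALE_AFTER_MS = 30000; // no updates for some time

    function locationQuality(tag, now) {
      if (now - tag.lastSeen > STALE_AFTER_MS) return 'stale';
      if (tag.lastSeen - tag.firstSeenAtLocation >= SAFE_AFTER_MS) return 'safe';
      return 'fresh';
    }

    // A tag first scanned 8s ago and last seen 1s ago counts as "safe":
    const now = Date.now();
    console.log(locationQuality({ firstSeenAtLocation: now - 8000, lastSeen: now - 1000 }, now));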

Some Screenshots

Here are some early screenshots before beautification by our artists in residence (SNK).

zones

The zones UI (above) is a real-time view into the location tracking system. Updated every second via a socket.io connection, it shows all tags with associated customer accounts and their location state. Looks like I’ve been at Labs for some time (therefore a “safe” location), Agnieszka was “freshly scanned” at the cafe on the 4th floor, and Ulf also just walked into the Kleve meeting room.

map

Another view option is the map view. Terribly ugly right now, but technically it already shows the data. We’ll soon change the UI and the concept behind the map – it will probably have the character of a heat map.

analytics_locations

Our first analytics UI will show the safe locations over the last 24 hours. It gives you an idea of which areas are most frequented.

journey

Probably the roughest of the UIs (but hey, we have nothing to hide, do we?) is the journey UI. Per user, you can see the safe locations in a journey view (and again, just the last 24h).

What’s next?

While we already have a few UIs, the main focus at this point has to be on the technical aspects and to make the core RFID system run really well. We need to properly test the system with our colleagues and continue to find and fix bugs. But when it comes to big features that are currently in our heads, these are:

  • support for so-called “action readers” that are tied to a location, but also to a specific action. In terms of an event, this might mean actions performed at the reception (new user signup) or at the bar (2 cappuccinos). These actions become part of the journey and will probably be visualized in such a UI. This will also bring up interesting integrations with other YaaS core services such as the cart. As we already have customer accounts, this should be easy.
  • integration with SAP Hybris Profile to be able to do things like: “customers similar to you also visited these locations”, etc.
  • once we’re stable enough: optimized UIs – kiosk-style location maps etc. for the events. Also: a builder module for easy configuration of the system per tenant.

Please help us! Carry the RFID tags, check the UIs provided to see if they make sense, and talk to us if you have questions!

And about 500 Customer Accounts later…

The #HybrisSummit is over, and it was a blast. As our bullseye prototype is based on YaaS, we’ve collected a lot of data, and I quickly wanted to share it with you. So here we go: we had >500 customer accounts created and >1200 product liftups in just two days! That means >500 demos given – very often with more people watching, of course!



If you look a bit deeper, you’ll see that the top product was the “Toucan”, our very colorful, sweet/medium-style candy. The data that we’ve collected from the profile questionnaires is also really interesting and will help us with upcoming orders: about 50% of all customers answered that they like sweet candy, followed by sour and salty.

Hope you like it, please reshare & comment as you like!

Revised Bullseye Architecture

I just wanted to share our latest, beautified architecture diagram with you. Have a look below – all major components of the bullseye prototype are on it. One new addition is the system configurator view, which is represented via the builder module. I’ll try to write another post soon about more tech changes.


Bullseye Technical Architecture

Bullseye – in-store targeting and analytics – an update

Yes, it’s been more than a month since the last update. The assumption was right – tons of work was ahead of us. And things still need to be optimized, beautified and fixed… but the end is in sight! We’ve made huge progress… we got products, we got a name, we overhauled the technology and worked with the designers to beautify the whole prototype. All while maintaining full YaaS compatibility and flexibility.

#1 – the name: bullseye. We think it’s great for a prototype about in-store targeting and analytics.

#2 – the products: we switched from perfume to candy. This morning, I got the confirmation for >180kg of candy delivered next week to the hybris labs premises here in Munich 🙂

#3 – why do we need all that candy? At this point, we also got full confirmation to make this prototype the central piece of art/technology at the hybris summit 2016. We’re running our recommendation and analytics system for two days at this event. If you can, please stop by!

In case you’re completely unsure what this is all about, here’s a brief summary – directly from the documentation that I wrote yesterday.


So let’s take a more technical view of the updates. Below is the current architecture diagram, also soon to become even more shiny. For now, all the technical goodness is on it. Let’s step through each part.

Bullseye - plat Technical Architecture (1)

platforms

bullseye is a YaaS-based in-store product recommendation and analytics system. The end-user-facing and visible components are the platforms (here numbered 1-8) that can be programmed via commands in various ways. For product selection, for example, the platforms may receive color events. The platforms also contain one or more sensors whose values can be requested. For power supply and communication, the platforms are connected to a base station via a standard Micro USB cable. As a typical IoT edge device such as a Raspberry Pi has only a limited number of USB ports and a limited power supply, standard USB 2.0 hubs are used between the base station’s USB ports and the platforms.

Recent updates here, not including bug fixes:

  • We’ve optimized the serial communication – previously, every command sent to the platforms (serial, byte-based communication) was answered with an event that repeated the data for confirmation. Now the platforms only respond, with a JSON-formatted message, when the command triggered an EEPROM update or a sensor reading.
  • We’ve implemented new commands, mainly for light effects. The platforms can now flash in RGB colors (random flashing, simulating a “thinking” system) and show a few other color effects – see the sketch after this list.
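As promised, here’s a sketch of what issuing such a serial command from the base station could look like with the serialport package – the command bytes and device path are invented for illustration:

    const SerialPort = require('serialport');

    const port = new SerialPort('/dev/ttyUSB0', { baudRate: 115200 });

    // e.g. 0x01 = "set color", followed by R, G, B bytes (command IDs invented)
    function setColor(r, g, b) {
      port.write(Buffer.from([0x01, r, g, b]));
    }

    // Platforms now only answer when a command triggered an EEPROM update or a
    // sensor reading; those responses are JSON-formatted.
    port.on('data', (chunk) => process.stdout.write(chunk.toString()));

    port.on('open', () => setColor(255, 0, 0)); // turn the ring red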

Next, the base station – that’s where the platforms connect to via USB cable:

The base station acts as a gateway between the platforms and the internet. It connects to the internet via the IoT standard protocol MQTT and to the individual platforms via a serial, byte-based communication protocol. The base station is subscribed to an MQTT topic for the base station itself and issues commands sent to this topic to all connected platforms. At the same time, it is also subscribed to individual topics for each platform – this allows each connected platform to be addressed individually. The central communication element in this system is an MQTT broker. It is the essential piece that connects the base stations to the internet and allows the remaining bullseye system to send commands to the platforms and to receive events from them.
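Put into code, the topic layout could look roughly like this – topic names are my assumptions, not the actual bullseye scheme:

    const mqtt = require('mqtt');
    const client = mqtt.connect('mqtt://broker.example.com');

    const connectedPlatforms = ['p-01', 'p-02', 'p-03']; // discovered over USB

    client.on('connect', () => {
      client.subscribe('bases/base-1'); // base-wide commands
      connectedPlatforms.forEach((id) => client.subscribe('platforms/' + id));
    });

    client.on('message', (topic, payload) => {
      if (topic === 'bases/base-1') {
        // one command fans out to every connected platform over serial
        connectedPlatforms.forEach((id) => sendOverSerial(id, payload));
      } else {
        sendOverSerial(topic.split('/')[1], payload);
      }
    });

    function sendOverSerial(platformId, payload) {
      console.log('-> ' + platformId + ':', payload.toString()); // stands in for the serial write
    }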

Recent changes:

  • We can now address the base station itself with a dedicated MQTT topic and the base station forwards the command to all connected platforms. This minimizes the network load and is great in combination with the “flash” command. We use this command to simulate a thinking system, right before the results are presented with individual lit-up platforms.
  • MQTT reconnect behavior: we’ve fixed a major bug around the reconnect behavior. From time to time, our MQTT connection is closed due to network issues. We now re-subscribe to all platform topics after a reconnect, so the system stays fully functional.
  • Our base stations are now Raspberry Pis, and we use the excellent forever-service to start our node process in a “forever” fashion. Even if the process is killed, forever restarts it immediately.

Let’s move on to the bullseye YaaS package:

The bullseye package is part of the YaaS marketplace and can be subscribed to by a client project. The package contains the bullseye service, offering various UIs for the end user and retailer, and also the central matching service. The matching service is a completely tenant-aware service that uses the profile input from a customer questionnaire to score a selection of products. The result is a scored list of products that are mapped to platform IDs and then addressed with a color command over MQTT. This results in a physical selection of products based on the preceding product matching step. The bullseye service also contains an internal analytics module that powers various analytics UIs. The bullseye builder module is used by a client project to configure the bullseye system. Typical configuration includes the mapping of products to platforms and the setup of a customer-facing questionnaire with scoring information for each correctly answered question.

Changes, well, tons. Let’s see:

  • The matching service is the central component, taking the profile info and matching it with products based on the questionnaire and scoring information. We’ve now implemented a proper blocking behavior – a user enters a session and operates the base/shelf alone for 30 seconds. If he chooses to keep using the shelf, the session is extended (see the sketch after this list).
  • We’ve created 3 analytics screens that connect to real-time data via socket.io channels. All analytics is in memory, which is OK for a prototype. It’s nicely part of a single node.js library and could easily be persisted. Below are some of the analytics screens. More later on when the screens have been completely redesigned.
  • The questionnaire/form UI is already beautified – in the pics below, you see the result. It works excellently on desktop/tablet/phone, fully responsive. Form resubmission for flashing effects and a back button to play again complete the changes here.
  • A product info screen, connected to live product liftups, has been added.
  • A randomizer feature will perform a slideshow among the 3 analytics and product info screens. For events, we’re able to launch it and it just runs and shows all screens over time.
  • More… but this is getting too long…
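Here’s the promised sketch of the session blocking – a minimal in-memory version; the real service is tenant-aware and holds more state:

    const SESSION_MS = 30000;
    const sessions = new Map(); // shelfId -> { userId, expiresAt }

    function acquireShelf(shelfId, userId, now = Date.now()) {
      const s = sessions.get(shelfId);
      if (s && s.expiresAt > now && s.userId !== userId) return false; // shelf is busy
      sessions.set(shelfId, { userId, expiresAt: now + SESSION_MS });  // new or extended
      return true;
    }

    console.log(acquireShelf('shelf-1', 'alice')); // true – the shelf is hers for 30s
    console.log(acquireShelf('shelf-1', 'bob'));   // false – blocked
    console.log(acquireShelf('shelf-1', 'alice')); // true – her session is extended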

So – you probably want to see some pictures, right? See below. What’s next? While a lot has been done, we’re still working on the finishing touches. Potentially we’ll save the product results to a customer’s cart – so when she opens the cart later at home or at the event, the cart is prefilled with the matching products. We also need to take care of all kinds of small UI-related issues, and we need to make sure the logistics for the hybris summit are taken care of. We’re creating an amazing construction, a pole-like art installation, together with our booth builders here in Munich. Stay tuned!

The state of the end-user questionnaire UI:


The state of the builder module:


And finally some of the early analytics screens:


In-Store Targeting and Analytics – on YaaS!

It’s finally time to write about a new project we’re working on. Hopefully this also helps to clear up a few of the open issues we’re still working on. So here’s some news about a project we’ll probably name “bullseye”. To some degree it is an extension of the wine shelf, but it’s super flexible in terms of configuration and products. And – boom – it’s almost 100% based on YaaS, the new hybris commerce APIs.

Architecture, rough… 

This architecture is rough and can change at any moment, but it’s good ground to describe what this is about. The idea itself – again – is about selecting products in the physical retail space, and also about providing feedback to the retailer about physical interactions with products. YaaS plays a big role, as we use the YaaS Builder Module system to edit all the configuration of the system. We’ve also written our own YaaS service that provides the product matching logic in a completely tenant-aware fashion.

Bullseye - plat Technical Architecture (1)

Platforms and Bases = Smart Shelf

From a technical perspective, the hardware used is less impressive – it’s really not the focus this time. We’ve worked on a 3D-printable design that contains the electronics for the hardware parts of this prototype. Each of the platforms below (so far we have about 20 fully working platforms) contains a microcontroller for the logic, a large 24-pixel NeoPixel LED ring (output) and an LDR (light-dependent resistor, input). The platforms connect via Micro-USB to a base (power, serial data), which will most likely be a Raspberry Pi again. In between, we need standard USB 2.0 hubs, as a Raspberry Pi has only 4 USB ports and we would like to power as many as 20 or 30 platforms from one base. Check out some images below.


The firmware that runs on the platforms is able to receive a few commands over a custom serial protocol. Via this protocol, we can change the identity of the platforms (stored in EEPROM), read the sensor value or issue a light effect command (e.g. turn all pixels on, turn them red). It’s a fairly low-level, basic communication protocol. The only business-level logic that so far still runs on the microcontrollers is the calculation of liftup times: we count the duration between the increase of light (product lifted) and the decrease of light (product put back down). To not interfere with the NeoPixel (light) ring, we block the event calculation during light effect execution.
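The firmware itself is microcontroller code, but the liftup logic is simple enough to sketch in JavaScript for illustration – thresholds and values are invented:

    const LIGHT_DELTA = 200; // sensor change that counts as a lift / put-back
    const baseline = 512;    // ambient LDR reading with the product in place
    let liftedAt = null;
    let effectRunning = false; // set while the LED ring animates

    function onSensorReading(value, now) {
      if (effectRunning) return; // the ring's own light would corrupt the reading
      if (liftedAt === null && value > baseline + LIGHT_DELTA) {
        liftedAt = now; // more light reaches the sensor: product lifted
      } else if (liftedAt !== null && value <= baseline + LIGHT_DELTA) {
        console.log('liftup lasted', now - liftedAt, 'ms'); // product put back
        liftedAt = null;
      }
    }

    // Example: lift at t=0, put back at t=2500 -> "liftup lasted 2500 ms"
    onSensorReading(800, 0);
    onSensorReading(500, 2500);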

The bases, most likely Raspberry Pis, each have a unique ID. The platforms, again, have unique IDs. Via MQTT (node.js using an MQTT client library) we can issue commands to the bases and to the platforms directly.

MQTT Broker

An important architecture component that we can’t live without is the MQTT broker. Due to port restrictions and other technical issues, this part currently lives outside of the YaaS cloud. The bases connect to the broker and bridge the platforms’ serial connections to it. The bases subscribe to MQTT topics that match the platform IDs. They also subscribe to a base-level topic, so we can send base-wide commands. If a platform disconnects from a base, we unsubscribe from the MQTT topic of that platform. This keeps the required communication bandwidth light.

YaaS Builder Module

The builder module that you get once you subscribe to our package in the YaaS Marketplace allows you to configure the physical mapping and the questionnaire that the end user finally gets to see. The products derive from the products you’ve configured via the YaaS product service. Below are a few honest screenshots, taken before we even started styling these screens (be kind!).

As a user, you’ll first have to choose a shelf, which is identified by the ID of the base. Next, you choose which product category you’re creating the recommendation system for. All products of the shelf need to adhere to a common set of attributes, hence the category. Third, you’ll assign the products of that shelf/category combination to platform IDs. Finally, the scoring configuration – which questions, which answers, which score per correct answer – is specified. The scoring configuration is the key ingredient of the end-user questionnaire form. Once all four steps are completed, the retailer is given an end-user URL that can be turned into a shelf-specific QR code (or put onto an NFC tag, or put onto a physical beacon, or shortened and printed, etc.).
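To make the four steps concrete, here’s what the resulting configuration might look like as a document – purely illustrative; the field names are guesses rather than the actual YaaS schema:

    {
      "shelf": "base-7f2a",
      "category": "candy",
      "platformMapping": { "1": "toucan", "2": "cola-bottles", "3": "sour-worms" },
      "questionnaire": [
        {
          "question": "How do you like your candy?",
          "answers": [
            { "label": "sweet", "scores": { "toucan": 2, "cola-bottles": 1 } },
            { "label": "sour",  "scores": { "sour-worms": 2 } }
          ]
        }
      ]
    }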


YaaS Matching Service

Our matching service is triggered by a special URL that goes through the YaaS API proxy. All requests and bandwidth are counted and can later be billed. The end-user experience begins with the rendering of the questionnaire. The user chooses his answers and sends the data off to the matching service. The matching service then pulls the scoring configuration, the products and the mapping to calculate the matches. Based on a relative threshold, we calculate which products – and therefore which physical platforms – are highlighted. Then, MQTT messages are sent out to the bases/platforms to light up the appropriate platforms.
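Condensed into code, the matching step could look roughly like this – the scoring model is simplified and all names are my assumptions:

    function match(answers, scoringConfig, threshold = 0.8) {
      const scores = {};
      answers.forEach(({ question, answer }) => {
        const spec = scoringConfig[question] && scoringConfig[question][answer];
        if (!spec) return;
        Object.entries(spec).forEach(([product, pts]) => {
          scores[product] = (scores[product] || 0) + pts;
        });
      });
      const top = Math.max(0, ...Object.values(scores));
      // relative threshold: highlight everything close enough to the best match
      return Object.keys(scores).filter((p) => scores[p] >= top * threshold);
    }

The matched products map to platform IDs, and each of those platforms then receives an MQTT color command to light up.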


Once a customer uses the system via the questionnaire, the shelf belongs to her for the next few moments. This means we block access to the tenant/shelf combination for some time. During that time, the user interacts with the shelf in a personalized session. Lifting up a product results in detailed information being displayed directly on the customer’s tablet or smartphone. And of course, it fuels a few analytics displays that still need to be detailed.

What’s next? tons of work ahead.

We’re working hard on the specs for the initial version of this prototype and on the sample products, categories and configuration that we’ll use for the hybris customer and partner days in Munich (early February 2016). But we’re also thinking of a few extra features that might make it into the prototype by then: for example, a stocking mode, in which the platforms light up one after another and the screen shows you the product that needs to be on each one. It helps both the labs member setting up a demo and the retail employee stocking a shelf. And we’re thinking of sending the recommended products via email – a customer could then continue shopping at home with a pre-filled cart.

Got ideas? Let us know. This is the time to provide input!

Funky Retail Trial at the LMU Campus Store – it works

We’ve been trying to validate the effectiveness of our prototypes around the connected retail store for quite some time. We knew, of course, that many of our prototypes attract the attention of customers, because we’ve seen people reacting positively at so many conferences. But getting hard evidence – real numbers – is hard.

Therefore, we’re super happy to report that, with the help of the University of Munich, we’re finally getting a step closer. For one week, we installed 4 of the 5 Funky Retail boxes at the campus store of the University of Munich. Chinmay, from the university, set up a plan to help us collect the right data. We placed two different products (2 of each) on the 4 boxes. One box lit up whenever a customer came close, while the other box with exactly the same product on top did not. So far we’ve analyzed the presence events and the lift-ups of products. During the next weeks, we’ll also be getting the sales data for the products.

These are the results so far:

  • During that one week (5 working days) we collected around 3000 data points – individual presence and lift events.
  • The boxes that lit up when a customer came close resulted in roughly twice as many lift events. That means customers were attracted by the light effect and some of them lifted the product to take a closer look.
  • When a shopping assistant was close, customers were less likely to interact with the boxes. That’s a really interesting finding – shopping assistants seem to disturb the natural curiosity of customers.

The above results finally give us some early proof that experimenting with light in the retail space is not just fun and cool, but also has a real effect on customer interactions and – potentially – sales.

Let us know what you think in the comments or tweet me directly!

Coding for Kids – our codeweek participation

For the last two days, Lars and I have been on a special mission. Powered by codeweek.eu and our internal hybris/SAP CSR program, we taught about 10 kids at a school south of Munich the basics of programming. We chose the open-source and kid-friendly Kano OS for this, and it was really an excellent choice.


Our workshop began with us explaining the Raspberry Pi and how to put it together. The kids worked in teams of two and put it all together by themselves. Most of them already knew what the different ports are used for and really needed little help.

We worked completely offline on the first day, which was a good choice in retrospect. There are ways to play together on Kano OS, which is nice, but it can also be a source of distraction. So we got them started with exploring the command line (game: Snake) and visual programming via the game Pong on day one. A few kids also explored the Make Art program, which lets them code a picture – it’s a more creative way to start coding.


At the end of day two, some kids had started to code – simple Minecraft scripts, for example. We discussed when coding is preferable to crafting and when it’s the other way around.


To chill the classroom down at the end of a group Minecraft session on day two, the teacher put a Raspberry Pi in the middle and everybody had to visualize being inside the Raspberry Pi. After breathing deeply a couple of times, we were all super relaxed again. We then started to collect feedback – on Lars’ back, because that’s more fun!


Many thx to all involved – the school, the teachers, and SAP/hybris for letting Lars and me do this! I am sure that we’re gonna have some Raspberry Pis under the Christmas tree this year.