Augmented Commerce

Video: Demo in a Minute

hybris labs can do B2B! Okay, you might not recognise it as such at first, because it looks as if we're just playing with lego. But far from it! If you honestly believe we use our valuable working hours to build lego aeroplanes… you're not absolutely wrong… But we had a very good (commerce related!!) reason to do so, as you will understand shortly. The problem with demoing B2B is that it's often a bit too big (get it? sorry…). By using lego we can demonstrate the prototype in a closed environment which we've created ourselves. We call this a space-efficient demonstration.


Augmented Commerce is an enhancement of our previous augmented reality prototype. The basic idea of recognising objects with your phone or tablet and then directly ordering new parts has stayed the same (we're also using the same software), but we've added some new elements. You can now scan a more complex object, like a (lego) aeroplane, and the application will list all the parts within this object, so the item that needs to be replaced can easily be identified and chosen. All product information displayed within the app is pulled from the hybris server, and the ordering process is integrated with the hybris platform. And since we've created our own lego universe, we can really deliver the required part instead of just describing what would happen next. After the order has been approved by the manager (more B2B stuff), the specific lego piece is immediately delivered in the coolest way you can imagine. When you see it you'll start jumping up and down, clapping your hands like a child in total excitement! Or maybe that's just us… There'll be a demo video soon, but you really need to experience it live. So pop in at NY82!


Time for a change

Imagine a lady shopping for clothes… no, actually don't imagine a lady, picture yourself! Because the odds are pretty good you've already been in the following situation, or at least in one similar to it. So, let's start again. You're shopping for clothes. You've already been round town for a while and have had a fairly successful tour so far, with the result that you're now carrying a minimum of three bags around with you. Oh, and it's winter, which means you'll also be wearing a thick jacket. All that's left on your list is a pair of trousers. You find a pair you quite like, pick your size, walk to the changing room, put down your bags, take off your jacket, take off your shoes, take off your trousers, try on the pair you've chosen and… bugger! Wrong size… You now have a few options:

a) You put your trousers and your shoes back on, take your jacket and your four bags, look for the right size, and… you know the rest.

b) You stay as you are, leaving your shoes, your trousers, your jacket and your five bags in the cabin (hoping that nobody steals your phone and your wallet), and then walk through the shop half naked, searching for the right size. Then from the top…

c) You're lucky enough to have a partner who accompanies you on your odyssey. So, depending on your personality and relationship, you now either stick your head out of the cabin, communicate through the closed curtain, or your partner joins you in the cabin. In any case your belongings and your six bags are safe.

Does any of that sound remotely familiar? Oh by the way, this is how our new prototype works:

A changing room is equipped with an RFID scanner and a tablet computer. When an item (labelled with an RFID tag) is brought into the changing room, it will almost immediately appear on the tablet. Using the application on the tablet, the customer can choose a different size or colour and have the selected item 'delivered' to the cabin. The customer can also select items to create a wish list, which is sent to him via email.
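Just to illustrate the flow, here is a rough node.js sketch of the idea. The RFID reader interface, the mini product "catalog" and the push channel to the tablet are all stand-ins for illustration – the real prototype talks to an Impinj reader and pulls product data from the hybris platform.

```javascript
// Hypothetical sketch of the changing-room flow: an RFID tag read is mapped
// to a product and pushed to the tablet web app via Server-Sent Events.
const EventEmitter = require('events');
const http = require('http');

const reader = new EventEmitter();   // stand-in for the RFID reader driver
const clients = [];                  // connected tablet browsers (SSE)

// Tiny stand-in catalog – the real data comes from the hybris server.
const catalog = {
  '3008:1234': { name: 'Hiking Trousers', size: 'M', colour: 'blue' }
};

http.createServer((req, res) => {
  // the tablet web app opens this endpoint and listens for products
  res.writeHead(200, { 'Content-Type': 'text/event-stream' });
  clients.push(res);
}).listen(8080);

reader.on('tagRead', (epc) => {
  const product = catalog[epc];
  if (!product) return;
  const msg = 'data: ' + JSON.stringify(product) + '\n\n';
  clients.forEach((c) => c.write(msg));   // the item "appears" on the tablet
});

// simulate a garment being carried into the changing room
setTimeout(() => reader.emit('tagRead', '3008:1234'), 5000);
```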

A big thank you to the hybris UI/UX team for taking care of the design and Impinj for assisting us with the hardware!


Changing Room live on stage at SAP d-kom! See more here!

Tiles in color, plus finalized arch poster

Now that we have all major events (except the hybris xmas party) behind us, we can finally focus on getting a few projects really finalized. Tiles has made huge progress over the last few weeks, and I just got the fully-colored tiles in, plus I have a finalized architecture poster that I want to share with you. Big kudos go out to Elke and DerGrueneFish, our booth-building partners for this and most other projects. The tiles (21 in total, for 3 complete demo sets) are colored in 4 fresh colors for a change (no boring white!). I absolutely love the way they look. Over one day, I soldered the first 7, which are currently connected to one hub.


For the poster, Kathi at SNK did an awesome job. I've already ordered the poster, which we'll present at our booth at the hybris summit 15 in Munich. Having a descriptive poster will greatly help us explain the IoT setup for this prototype. Right now we expect to have cans on top of the tiles, so we made that part of the poster.


Just to recap the architecture, have a read:

  • “Tiles” are the wirelessly connected platforms. We use Punchthrough’s LightBlue Bean and remove the battery holder to make the platforms 8 mm high. We still use CR2032 batteries, which gives us about one week of battery life right now. We could get more, but I send out a MetaEvent every 10 seconds, which is hard on the battery.
  • The “Hub” collects all data. It continuously scans for tiles and connects to them. The hub runs on a Raspberry Pi, uses a BLE dongle (the choice of dongle matters here) and uses node.js for all programming. It forwards the data to the server via CoAP – a UDP-based IoT protocol (see the sketch after this list).
  • The “Server” collects all data from all hubs (yep, there can be many) and provides the necessary APIs for managing the user/tile association, authentication and authorization (OAuth2 is used here), etc.
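To make the hub's role a bit more concrete, here is a minimal node.js sketch of how a LiftEvent could be pushed to the server over CoAP using the node-coap module. The hostname, path and event payload are placeholders, not our exact format.

```javascript
// Minimal sketch: forwarding a tile event from the hub to the server via CoAP.
// Requires the "coap" npm module; event format and URL are placeholders.
const coap = require('coap');

function sendEvent(event) {
  const req = coap.request({
    hostname: 'tiles-server.example.com',  // placeholder server
    pathname: '/events',
    method: 'POST'
  });
  req.write(JSON.stringify(event));
  req.on('response', (res) => {
    console.log('server answered with code', res.code);
  });
  req.end();
}

// e.g. called from the BLE notification handler on the hub
sendEvent({ tileId: 'tile-07', type: 'LiftEvent', battery: 87 });
```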

 

One change over the last few days is that we can now associate products with the tiles. That means a store manager can just scan a tile (NFC or QR) and add it to his private analytics page. The UI of these web pages is currently being worked on and will include a few nice touches, such as a heartbeat every 10 seconds and the colour of the scanned tile, which is pulled from static, factory-defined data. The system is all up and running now, currently with one live hub and 7 tiles connected.
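The association itself is a simple server-side call. Here is a rough sketch with Express – the route, payload and in-memory store are made up for illustration; the real server persists the data and checks OAuth2 tokens.

```javascript
// Hypothetical Express route: a store manager scans a tile (NFC/QR) and the
// app calls this endpoint to claim the tile for his analytics page.
const express = require('express');
const app = express();
app.use(express.json());

const claims = {};  // naive in-memory store, for illustration only

app.post('/tiles/:tileId/claim', (req, res) => {
  const { tileId } = req.params;
  if (claims[tileId]) {
    return res.status(409).send('tile already claimed');
  }
  claims[tileId] = { owner: req.body.userId, claimedAt: new Date() };
  res.status(201).json(claims[tileId]);
});

app.listen(3000);
```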


What’s left is the callback mechanism plus the web UI. The callback mechanism will “call out” to external systems for each event received. So if a LiftEvent is received and a webhook is configured, we’ll send an HTTP POST to the configured external service. I also plan to pull in the product details from YAAS, hybris’ on-demand API offering.
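The callback mechanism is essentially a fan-out of events to configured URLs. A rough node.js sketch of that idea – the webhook registry and payload below are placeholders:

```javascript
// Sketch of the planned callback mechanism: for every event received,
// POST it to each webhook configured for that tile. URLs are placeholders.
const https = require('https');

const webhooks = {
  'tile-07': ['https://external.example.com/hooks/lift']
};

function dispatch(event) {
  (webhooks[event.tileId] || []).forEach((url) => {
    const req = https.request(url, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' }
    });
    req.on('error', (err) => console.error('webhook failed:', err.message));
    req.end(JSON.stringify(event));
  });
}

dispatch({ tileId: 'tile-07', type: 'LiftEvent', ts: Date.now() });
```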


Tiles Update – we've added blinky blinky

Our project Tiles, little BLE-connected platforms for customer interaction tracking, is entering a phase which allows me to blog and inform you a bit more. Since last night, the Raspberry Pi and the Arduino in the hub use one power source. This makes the overall design easier. We have also been working on a Raspberry Pi B+ HAT, using Eagle, to further optimize our design.

Another visible change is that it now blinks 🙂 The hub rotates an LED light to signal the BLE scanning process. It flashes once you lift up the product – well, the apple in this case.


We’ve now also locked down the architecture and below is a rough sketch that should help understand it. Again, a quick summary below.

tiles technical architecture

  • “Tiles” are the wirelessly connected platforms. We use Punchthrough’s LightBlue Bean and remove the battery holder to make the platforms 8 mm high. We still use CR2032 batteries, which gives us about one week of battery life right now. We could get more, but I send out a MetaEvent every 10 seconds, which is hard on the battery.
  • The “Hub” collects all data. It continuously scans for tiles and connects to them. The hub runs on a Raspberry Pi, uses a BLE dongle (the choice of dongle matters here) and uses node.js for all programming. It forwards the data to the server via CoAP – a UDP-based IoT protocol.
  • The “Server” collects all data from all hubs (yep, there can be many) and provides the necessary APIs for managing the user/tile association, authentication and authorization (OAuth2 is used here), etc.

One more thing – I’ve connected the server to Xively, a data-logging platform. We mainly collect the battery rundown to estimate battery life, plus the temperature values from the LightBlue Beans. At this point I just want to share some nice graphs to show you how much sense it makes to track that data. It will definitely help us optimize the design and battery consumption further. Right now we stay optimized for demo purposes, but we can later reduce the number of events sent, for example, to get better battery life.
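For the curious, pushing a datapoint to Xively is a single REST call. A rough sketch, assuming Xively's v2 feeds API – the feed ID, datastream name and API key are placeholders:

```javascript
// Sketch: updating a Xively datastream with the current battery reading.
// Feed ID, datastream ID and API key below are placeholders.
const https = require('https');

function logBattery(value) {
  const body = JSON.stringify({
    version: '1.0.0',
    datastreams: [{ id: 'battery_level', current_value: String(value) }]
  });
  const req = https.request({
    hostname: 'api.xively.com',
    path: '/v2/feeds/123456789',
    method: 'PUT',
    headers: { 'X-ApiKey': 'YOUR_API_KEY', 'Content-Type': 'application/json' }
  });
  req.on('error', (err) => console.error('xively update failed:', err.message));
  req.end(body);
}

logBattery(2.9); // e.g. coin-cell voltage reported by a tile
```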


Next up: Tiles

I started blogging bits and pieces about BLE – Bluetooth Low Energy – a few weeks ago. The research we’re doing right now will help us to connect little, battery-powered “tiles” (hence the name) to a hub that collects data from these sensors. This prototype is part of our vision for a connected retail space. We’ve now received a first CNC-milled hardware prototype for the housing, which is shown below.


We’ll now start connecting the bits and pieces. Our overall architecture consists of 3 parts:

  • The satellites – the tiles – contain LightBlue Beans, little Arduino/BLE microcontrollers that run on batteries. The tiles also contain a single pressure sensor that is used to detect whether a product is on top (or has been lifted up). The event data contains the tile’s ID, the event (up/down) and the battery level. The mechanism we will most likely end up using is BLE notifications that originate from the LightBlue Bean and are received by the hub.
  • The hub is also still under development, but some early node.js code already works nicely on my Mac. The hub scans for the tiles, which send BLE advertisements, and connects to them. It receives the events via BLE notifications and will have to manage the tiles and pass the events on. The protocol we would like to use here – for the first time – is CoAP. It is essentially a binary version of HTTP, runs on UDP and is, as the name suggests, made for constrained applications. With that, we will have used ZMQ, MQTT and CoAP across our IoT prototypes.
  • The server will receive all CoAP messages from the tiles (see the sketch after this list), process and persist (or at least keep) the data, and allow users of the system – customers, store managers – to manage the tiles. We intend to print a QR code or attach an NFC tag to each of them. Once you touch a tile’s NFC tag with your phone and have passed the OAuth2-based authorization, you can add the tile to your personal analytics view. The goal is to make it really easy for a store manager to add these ‘sensor elements’ (tiles) to his analytics view. Once a tile is claimed, the analytics data is not accessible to anybody other than the person who claimed it.
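On the receiving end, a CoAP server is straightforward with the node-coap module. A minimal sketch – the path and event format are illustrative only:

```javascript
// Minimal CoAP server sketch: receive tile events from the hubs.
// Uses the "coap" npm module; the event format is illustrative only.
const coap = require('coap');

const server = coap.createServer((req, res) => {
  if (req.method === 'POST' && req.url === '/events') {
    const event = JSON.parse(req.payload.toString());
    console.log('tile %s is now %s (battery %d%%)',
      event.tileId, event.state, event.battery);
    res.code = '2.01'; // Created
    return res.end();
  }
  res.code = '4.04';   // anything else: not found
  res.end();
});

server.listen(5683);   // default CoAP port
```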

So… yes, it’s still quite a long way to go.  I’ll try to update you once we make some progress. But let me know what feedback you have. Just tweet me or leave a comment directly on this blog.

Funky Retail

The next hybris labs prototype is coming soon. You might have read about the idea already here on our blog, but there have been a couple of developments. Funky Retail’s the name, in-store analytics is the game. With the Smart Wine Shelf we aimed to improve the customer experience through a recommendation system based on IoT technology. But we also realised that the Internet of Things offers ways to enhance in-store analytics. This is exactly what we focused on while designing Funky Retail.

On any standard shopping website, retailers know exactly when a customer visits, how long he stays, which products he looks at and for how long; they can recommend upsells and see whether the customer makes a purchase. Why should this not be possible in the physical retail world? That’s what we evaluate with Funky Retail. We identify the presence of a customer in front of a Funky Box; we count the product lift-ups; we measure how long a product is being lifted; and we even combine the individual product lift-up with the playback of an engaging product video.
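As a small illustration of the analytics side, here is a sketch of how lift-up counts and lift durations could be derived from the raw up/down events – the event shape is made up for illustration:

```javascript
// Sketch: deriving lift-up counts and lift durations from up/down events.
const stats = {}; // per-product analytics

function onEvent(event) {
  const s = stats[event.productId] || (stats[event.productId] = {
    liftUps: 0, totalLiftMs: 0, liftedAt: null
  });
  if (event.type === 'up') {
    s.liftUps += 1;
    s.liftedAt = event.ts;
  } else if (event.type === 'down' && s.liftedAt !== null) {
    s.totalLiftMs += event.ts - s.liftedAt;
    s.liftedAt = null;
  }
}

onEvent({ productId: 'mammut-jacket', type: 'up',   ts: 1000 });
onEvent({ productId: 'mammut-jacket', type: 'down', ts: 9500 });
console.log(stats); // { 'mammut-jacket': { liftUps: 1, totalLiftMs: 8500, ... } }
```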

To spice the whole thing up a little, we collaborated with the hybris customer Mammut. Mammut equipped us with some cool products that help us to put some more life into this prototype and round the story off. We don’t want to spoil the surprise, but we’ll give you a hint: A video shoot is scheduled with the hybris media team, and those guys have got a bit of climbing experience…

Watch the Funky Interview here!


The making of the #hybrislabs Oktoberfest of Things table

As you will know, we’re actively contributing to the Oktoberfest of Things. Our smart beer table is now pretty much ready and we are just waiting for some revised beer coasters to arrive. It’s a crazy project, but it still has some commerce background (automatic reordering or process improvements). Still, the focus here is of course on engaging with the community. We’ve shared all our source code and design files via GitHub – check it out here. The code is under GPL and the images are Creative Commons BY-SA. The current features include:

  • Double-tap to send a “tap” event. Currently this highlights the beer mug in the UI, e.g. to call the waiter.
  • Lift up the mug to send the “up” event. Whenever this happens, the lift-up count for that mug is increased.
  • Put the mug back down – the new fill level is calculated and animated based on an average of 30 ml per lift-up/sip (see the sketch after this list).
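The fill-level maths behind that last point is deliberately simple. A little JavaScript sketch of what the web UI does – the 1000 ml starting volume is an assumption (a Bavarian Maß):

```javascript
// Sketch of the fill-level calculation in the web UI:
// every lift-up ("sip") drains an average of 30 ml from the mug.
const MUG_VOLUME_ML = 1000; // assumed starting volume
const ML_PER_SIP = 30;

function fillLevel(liftUpCount) {
  const remaining = Math.max(0, MUG_VOLUME_ML - liftUpCount * ML_PER_SIP);
  return remaining / MUG_VOLUME_ML; // 1.0 = full, 0.0 = empty
}

console.log(fillLevel(5));  // 0.85 – five sips in
console.log(fillLevel(40)); // 0    – time to reorder
```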


In this post, I’d like to outline the technical architecture for this project. For the first time, we’ve used a Spark Core for one of our projects. The Core’s WiFi reliability has greatly improved over the last few weeks and we’re now able to work with it in our enterprise environment.

Take a look at the architecture below, which is pretty simple for IoT:

architecture

If we start on the left, you can see the beer coasters with integrated pressure sensors. In reality we have 8 of them, which is a good number of people to have around a beer table. The pressure sensors are hooked up to the 8 analog inputs of a Spark Core to figure out whether a mug is on the coaster or not. We also use the pressure sensors to listen for the taps. The Core then communicates (transparently for us) with the Spark Cloud, and our web UI opens an SSE connection (Server-Sent Events, like a one-way WebSocket) via JavaScript. All the logic to update the UI currently lives in JavaScript in the browser; the HTML page that hosts the JavaScript could be a static one.

Let’s take a look at how the firmware that runs on the Spark Core works.

In our main loop, which is executed over and over again just like on an Arduino, we loop through the 8 mugs. Max Schrupp, on the labs team, greatly improved the code a few days back and we now detect taps pretty reliably. For each mug, we determine the current state (up/down) and push that value into an array of states for that mug. We store up to 8 past values per mug this way. We then check for two things: is it a double tap? And if not, has the state changed? (We compare against all stored values, so all 8 values for a mug need to have switched.)

Sending the events will probably be the most interesting part for you. Luckily this part is easy, thanks to Spark.

Here, we check the state of the mug in the states array and then use Spark.publish to send the up or down event. The data we send is a snippet of JSON that fits nicely into the 84-byte maximum for the Spark publish function.

Within the website that represents the UI, a small piece of JavaScript receives these events.
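A minimal browser-side sketch of such a receiver, assuming the public Spark Cloud SSE endpoint with a placeholder access token – the event names and UI helper functions are made up for illustration:

```javascript
// Browser sketch: subscribe to Spark Cloud events via SSE and update the UI.
var source = new EventSource(
  'https://api.spark.io/v1/events?access_token=YOUR_TOKEN');

source.addEventListener('mug/up', function (e) {
  var envelope = JSON.parse(e.data);      // Spark wraps our payload in "data"
  var event = JSON.parse(envelope.data);  // our own small JSON snippet
  incrementLiftups(event.mug);            // hypothetical UI helper
});

source.addEventListener('mug/down', function (e) {
  var event = JSON.parse(JSON.parse(e.data).data);
  animateFillLevel(event.mug);            // hypothetical UI helper
});
```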

All the magic – sending the events via a WiFi connection up to the Spark Cloud and down to the web UI via SSE – is handled by Spark.


Please do check out the GitHub project for details and the full source code – which includes the web code (HTML, CSS, JS, plus images) and the Spark Core firmware (see the sparkcore directory). We will most likely improve the code over time, but our next stop is the #iotcon in Berlin, where we will demonstrate the smart wine shelf and our brand new #oktoberfestofthings beer table! I also have two sessions, one about our IoT prototypes and the other about Google Glass. In addition, I am co-moderating the hardware night on September 2, where we’ll have a look at many IoT devices.

Let's get funky!

Lights. Colours. Changing colours. Flashing lights. We could just as well open a dance club! But then some genius suggested we should add some analytics…  I mean, honestly…

That slightly spoilt our groovy party mood (and the dancing), but gave us a great idea. We call it “Funky Retail” and it’s coming soon! Sven is soldering day and night to get everything finished. There’ll be distance sensors, pressure sensors and LEDs. Can you guess what we’re up to? Two more clues for you: it will be an in-store prototype and involves a big screen. What would you make out of all of this? If you’ve got an idea, send an email to labs@hybris.com. The best idea will receive a reward, consisting of us stealing the concept, giving it a new name, and then writing about it here in our blog. What an honour! We’ll also let you know once we’ve got a patent on it.

FUNKY RETAIL IS COMING SOON! CLICK HERE FOR AN UPDATE.


Don’t touch this!

Moving objects with the power of your thoughts… No, we haven’t mastered telekinesis yet, but we’re getting close. Our In-Store Display senses when you’re there, knows who you are and reacts to your hand movements. Spooky! Actually… it’s fairly simple.

We use a distance sensor to detect the presence of a potential user. A beacon recognises whether this user is logged in with Facebook. And at the heart of this prototype, a Leap Motion controller enables the gesture-based input.

Put into action, a customer can browse through a store’s entire assortment with simple finger movements. This is a playful way to let customers retrieve product information and the location of the desired product within the store.
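For a feel of the gesture part, here is a tiny sketch using the leap.js client library. The swipe-to-browse mapping is a simplification, and showNextProduct / showPreviousProduct are hypothetical UI functions:

```javascript
// Sketch: mapping Leap Motion swipe gestures to catalog browsing (leap.js).
Leap.loop({ enableGestures: true }, function (frame) {
  frame.gestures.forEach(function (gesture) {
    if (gesture.type !== 'swipe' || gesture.state !== 'stop') return;
    // direction[0] > 0 means a left-to-right swipe
    if (gesture.direction[0] > 0) {
      showNextProduct();       // hypothetical UI helper
    } else {
      showPreviousProduct();   // hypothetical UI helper
    }
  });
});
```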


Coupon Boxes – small, simple & orange

We’ve had them for a while and have been demoing them regularly at events, but never got round to presenting them here. That’s not really fair though. Okay, they’re probably not as controversial as Google Glass and not as magical as Stream… but they’re orange, they flash (a bit) and we like ‘em.

Right, let’s be a bit more serious. Our Coupon Boxes are a great way of handing out vouchers to customers. They can be placed basically anywhere. The customer just needs to tap the box with his phone to receive the coupon – a signed URL, which is transmitted via NFC. Adapting the coupon to gender is also possible if a Facebook profile is detected.
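The “signed URL” part is plain HMAC signing. A rough node.js sketch of how such a coupon link could be generated – the secret, base URL and parameter names are of course placeholders:

```javascript
// Sketch: generating a signed coupon URL that a box hands out via NFC.
const crypto = require('crypto');

const SECRET = 'replace-me'; // placeholder shared secret

function couponUrl(boxId, couponCode) {
  const params = `box=${boxId}&coupon=${couponCode}&ts=${Date.now()}`;
  const sig = crypto.createHmac('sha256', SECRET).update(params).digest('hex');
  return `https://shop.example.com/coupon?${params}&sig=${sig}`;
}

// the server later recomputes the HMAC over the same parameters and
// rejects the coupon if the signature (or the timestamp) doesn't match
console.log(couponUrl('box-entrance', 'WELCOME10'));
```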

So where’s the extra benefit in handing out vouchers this way? Well, for a start it can be interesting to know where a coupon was picked up. Taking a more active approach, coupon boxes can be used to guide customers to specific places, adding an element of gamification to the shopping experience.

Again, both sides can profit: customers receive a voucher, retailers gain a bit of information and everybody’s happy.
