Serverless Cloud Maker

Year: 2018
Client: Google Cloud

Key cloud creation interfaces



Large LED screen visual design



View of the cannon with the large LED screen where the clouds are launched



Attendees created their own clouds at the creation station and launched their clouds with the cannon in the background




3D Sketch layout of the installation






Overview
Cloud Maker allowed users to create thousands of apps and experience on the spot how easy it is to use Cloud Functions. Attendees at Google Cloud Next launched their own apps and watched them apply filters to photos.
Role
Design lead
Art Direction
UX design





Impact
  • 11,000+ attendees
  • 25+ custom installations at various scales
  • 300+ Googlers interacting with customers and partners
  • 3,500+ total clouds created
  • 1,250+ daily experience engagements
  • 2,100+ leads captured





Context & Challenge
The experience was featured at Google Cloud Next, a three-day conference bringing people in the cloud computing industry together. We were brought on to create a fun, engaging demo that let attendees experience Cloud Functions hands-on.

So what is Cloud Functions? Think of it as something like IFTTT (If This Then That): a system that lets developers' snippets of code plug into Google Cloud services to accomplish a task. To showcase Cloud Functions, we had to create a simple app that let users select the functions themselves.
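The core idea can be sketched in a few lines: each "function" is a small, self-contained snippet, and a pipeline chains the selected snippets together, the way Cloud Functions lets snippets plug into services. This is purely illustrative Python, not the production code; the filters and the pixel format are assumptions.

```python
def grayscale(pixel):
    # Average the RGB channels into a single luminance value.
    r, g, b = pixel
    v = (r + g + b) // 3
    return (v, v, v)

def invert(pixel):
    # Flip each channel around 255.
    r, g, b = pixel
    return (255 - r, 255 - g, 255 - b)

def run_pipeline(image, functions):
    # Apply each selected function to every pixel, in order.
    for fn in functions:
        image = [fn(px) for px in image]
    return image

# A "photo" reduced to three pixels for illustration.
photo = [(200, 100, 0), (0, 0, 0), (255, 255, 255)]
result = run_pipeline(photo, [grayscale, invert])
```

Chaining independent snippets like this is exactly what the demo's pipeline builder exposed through its UI.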



Key Target Users
We were able to discern attendee profiles from event registration data. Once we understood the types of users that would be coming to the event, we could set the right level of interactivity for the installations.



  • 30% Developers
  • 30% IT + Business Decision Makers
  • 20% CEO/CTO
  • 20% Others


We identified developers and IT/business decision makers as the main users of the interactives. They were the ones who would be hands-on, so we had to make sure there was enough substance for them to look under the hood and poke at all the buttons and dials.
However, that didn't mean we could count out the C-suite audience. For the CEOs and CTOs who were less likely to get hands-on, we had to make the experience easy to follow from a distance.




The Process
We needed to create an app that was simple to understand, with input and output that were easy to see. What better way to show an app working than a photo being manipulated by a filter? So we set out to create an app using Cloud Functions that filters photos. While we were at it, we decided to use clouds as the visual metaphor for the apps.

To experience the full ecosystem, we had users:
1. Build the app (cloud)
2. Launch the app
3. Let it function by sending load (photos)



Discovery

The initial discovery kickoff with the stakeholders netted out a solid concept. The stakeholders and I wanted attendees to experience the app itself. The concept was written up with reference images and shared with the stakeholders.

An experience diagram roughly sketched out by the stakeholders was used as a starting point to hash out the key touchpoints.



To make the visual concept fit the product, I created simple diagrams and storyboards to iterate with the stakeholders. The diagrams above were used to establish the behavior of the input/output data and the cloud's (application's) function.


We wanted attendees to create applications that were open-ended and could perform many tasks. The Creative Technologist and I brainstormed various functions, input data points, and output results.

The experience consisted of creating an application and testing it. To test the application functioning, users could send requests (load) to the applications by triggering picture balloons with a pump adjacent to the cannon.

Through the discovery phase, the stakeholders and I were able to nail down the behavior of the cloud/application, its input/output data, and the flow that followed after the user launched the cloud.







User Flow






UX Studies
To deliver a truly custom experience, I initially envisioned users digitally drawing their own clouds. They'd be able to customize the cloud's features using the options in the left panels. However, due to the complications of live-tracing custom inputs, we had to resort to templatizing the clouds.

The pipeline builder, where users could create their own functions, started out as a simple drop-down menu UX. The intended experience was for anyone to build their functions without having to write code.
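The drop-down idea can be sketched as a lookup table: each menu label resolves to a ready-made snippet, so picking from the menu assembles a working pipeline without any code being written. The labels and stub functions below are hypothetical, not the actual menu contents.

```python
# Hypothetical menu: label -> ready-made snippet. The stubs just tag the
# image with the filter name so the flow is easy to follow.
FUNCTION_MENU = {
    "Emojify": lambda img: img + ["emoji"],
    "Blur": lambda img: img + ["blur"],
    "Sepia": lambda img: img + ["sepia"],
}

def build_pipeline(selections):
    # Resolve the user's menu choices into callable functions, in order.
    return [FUNCTION_MENU[name] for name in selections]

def apply(image, pipeline):
    # Run the image through each selected function.
    for fn in pipeline:
        image = fn(image)
    return image
```

A user picking "Blur" then "Emojify" from the drop-downs would get a pipeline that applies both, in that order.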

The initial UX study of the load tester had four different types of requests mapped to four distinct physical pumps. Users could see how their clouds processed the requests on a log powered by Firebase.


An early storyboard of the LED screen is shown. The four distinct requests were to be represented by different types of birds and balloons. New clouds would always appear at the center, and a modal appeared showing the cloud processing requests.



Refinement
A set of 18 cloud templates was created. Each cloud had an expression and an accessory. The templates were spec'd out to show where each element should be placed.

A set of expressions and accessories was created to let users customize their clouds.

Cloud colors were specified as well.




The LED screen background was also updated with ambient clouds and a slight gradient. The gradient of the sky was programmed to reflect the actual time of day.
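One way such a time-of-day gradient could work is by interpolating between a few key sky colors. The sketch below is an assumption about the approach, not the production code, and the key colors and hours are made up.

```python
# Hypothetical key colors for a few times of day (hour -> RGB).
KEY_COLORS = {
    6: (255, 183, 94),    # dawn orange
    12: (135, 206, 235),  # midday sky blue
    18: (255, 140, 90),   # dusk
    24: (10, 10, 40),     # night
}

def lerp(a, b, t):
    # Linearly interpolate each channel between two colors.
    return tuple(round(x + (y - x) * t) for x, y in zip(a, b))

def sky_color(hour):
    hour = hour % 24
    if hour < 6:
        return KEY_COLORS[24]  # pre-dawn hours stay night-colored
    hours = sorted(KEY_COLORS)
    # Find the surrounding key hours and blend between their colors.
    for lo, hi in zip(hours, hours[1:]):
        if lo <= hour <= hi:
            t = (hour - lo) / (hi - lo)
            return lerp(KEY_COLORS[lo], KEY_COLORS[hi], t)
```

Sampling `sky_color` with the venue's local hour would give the screen a gradient that tracks the real sky.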


Conclusion
We wanted to make the experience truly hands-on. Each step of the way was filled with block-coding interfaces, a physical button, a smoke cannon, and a pump. Attendees created their apps, launched them, saw them live on a large LED screen, watched them process photos, and were alerted when their apps had done their job.


Introduction/Onboarding
Select your pipeline or create your own
Customize and edit your functions
Customize your cloud

Cloud completion confirmation

Cloud completion. Users are directed to go to the launch cannon station.











Users first searched for their clouds to launch at the station
A big red physical button was next to the tablet



Once the button was pressed, the cannon shot a smoke ring to the LED screen
and the cloud appeared on the LED screen


Once the cloud was launched, users were prompted to use the pump to inject requests and test the application


Users used the pump to send image requests to test the application








Users could check back on their clouds to see how many events they had processed. From the app, users could also send random requests to other clouds to process the image.
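The per-cloud event counts were backed by Firebase; as a rough local stand-in, the bookkeeping might look like the following (the class name and structure are hypothetical):

```python
from collections import defaultdict
import random

class EventLog:
    """Local stand-in for the Firebase-backed log of processed requests."""

    def __init__(self):
        self.processed = defaultdict(int)

    def record(self, cloud_id):
        # Increment the count each time a cloud processes a request.
        self.processed[cloud_id] += 1

    def count(self, cloud_id):
        return self.processed[cloud_id]

    def send_random_request(self, cloud_ids, rng=random):
        # Route a request to a randomly chosen cloud, as the app allowed.
        target = rng.choice(cloud_ids)
        self.record(target)
        return target
```

Checking back on a cloud then reduces to reading its counter, and the "send a random request" feature is just a random pick over the known cloud IDs.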







User’s selfie went through an emojify function cloud














Reflection
Landing on a mutual understanding of the client's product is always a challenge. It's critical that communication stays open and that there's a sense of collaboration. When it comes to demoing a program, block-based coding always works great, and letting users customize critical touch points gives them authorship over the experience. The mix of physical and digital interactions is always intriguing, although I'm not sure it can continue in the short term due to COVID. There must be new methods of integrating the digital with the physical.




Full Team
Visual Design: Bryan Park
UI Design: Ryan Greenhalgh
Technologist: Marcus Guttenplan