There was a stack of robots at SXSW this year. Over just four days I watched robots play drums, print names on table tennis balls (the Royals logo looked great, btw), debate the virtues of Lady Gaga vs. Taylor Swift, and even perform an ‘artificial (comedy) improv’ session dressed up in drag! All these androids ‘taking over the world’ are bound to make some agency folk more than a little paranoid about the future of work. But others are embracing the fear and running headlong into a cybernetic future where computers understand more of our intuitive human knowledge and, if we’re interested enough, may actually help improve one of the key things that makes us human in the first place: our ability to emotionally connect with others.
This was the premise behind a session entitled ‘Humans, Machines, and the Future of Industrial Design’ by Jason Robinson, Senior Design Lead at IDEO, and Pip Mothersill. The pair are best known for their work at the intersection of technology and design, in particular creating tools that help people design objects that express higher-level sentiments and emotions through aesthetic form.
So how might this work exactly? Well, whether or not we’re experts in design, we all subconsciously perceive meaning in objects through their physical design ‘language’: their form, colour, materials and so on. A famous example is the teaching technique used in the Disney studios, in which a simple flour sack is posed to convey different kinds of emotion.
Designers intuitively understand this language and actively translate it into meaning inherent in physical geometries, often using complex computer-aided design (CAD) tools to create 3D models of their designs. But what if we could decode this physical design language, so that CAD systems could use words, instead of numbers, to create expressive designs more intuitively?
At the MIT Media Lab, Pip created an emotive form design taxonomy that broke people’s emotive perceptions of different shapes down into quantitative design attributes, attributes that were then integrated directly into the design software itself. When a designer types a word into the emotive modeler, the system analyses the word’s emotional associations and generates a 3D model whose form reflects the emotive character of the word. In theory, the tool is valuable both as a starting point for novices who don’t know where to begin, and for expert professionals who want to quickly generate a whole range of emotive designs from which to build and inspire their final creations.
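To make the word-to-form idea concrete, here is a minimal sketch of that pipeline in Python. This is not the Media Lab’s implementation: the lexicon, the emotive axes (valence and arousal) and the attribute mappings are all illustrative assumptions, standing in for the real taxonomy.

```python
# Illustrative sketch: word -> emotive scores -> design attributes -> shape parameters.
# The lexicon and mappings below are hypothetical, not the emotive modeler's actual data.

# Hypothetical lexicon scoring each word on two emotive axes:
# valence (-1 unpleasant .. +1 pleasant) and arousal (0 calm .. 1 excited).
EMOTIVE_LEXICON = {
    "calm":       {"valence": 0.6,  "arousal": 0.1},
    "aggressive": {"valence": -0.5, "arousal": 0.9},
    "playful":    {"valence": 0.8,  "arousal": 0.7},
}

def form_attributes(word):
    """Translate emotive scores into quantitative design attributes.
    Pleasant words map to rounder forms; high arousal maps to busier, more angular ones."""
    scores = EMOTIVE_LEXICON[word.lower()]
    roundness = (scores["valence"] + 1) / 2   # 0 = sharp, 1 = fully rounded
    complexity = scores["arousal"]            # 0 = smooth, 1 = busy surface
    return {"roundness": roundness, "complexity": complexity}

def shape_parameters(word):
    """Turn design attributes into parameters for a simple extruded polygon
    that a CAD system could build."""
    attrs = form_attributes(word)
    return {
        "corner_radius": round(attrs["roundness"] * 10, 1),  # mm of edge filleting
        "vertex_count": 3 + int(attrs["complexity"] * 9),    # 3..12 sides
    }

print(shape_parameters("calm"))        # rounder profile, few sides
print(shape_parameters("aggressive"))  # sharper profile, many sides
```

A real system would replace the toy lexicon with a full emotive taxonomy and drive actual CAD geometry, but the shape of the pipeline, from language to numbers to form, is the same.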
You can check out the emotive modeler here: emotivemodeler.media.mit.edu
All this got me thinking: if this type of work is already being done to communicate a professional’s thoughts more effectively through the medium of design, surely it’s only a matter of time until someone figures out how to do the same thing through the medium of advertising?!
Creative search tools such as Yossarian are already being used by the creative community to offer fresh perspectives, provocations and feedback at a global scale. At the same time, AI-powered design platforms such as www.thegrid.io (while far from perfect) profess to ‘craft beautiful websites driven by human-centered values, constraints and direction’. I can’t claim to know exactly what the future of the broader communications industry and AI platforms might look like, but one thing’s for sure: those who embrace, develop and leverage this merger of being and machine to improve creative product will be the winners in a world that’s barreling towards a cybernetic future.
Also: if you’re heading to Vivid Sydney or are around in June, you might be interested in this event. There are a couple of Royals involved.