EVINCE

Whether we like it or not, our digital selves have become a second skin, competing with our oldest layer of expression: our clothing.

We walk with two skins now; one defined by our virtual self, the other defined by our clothes.

 

What if these two external expressions were linked?

What if your garments could reflect your current emotional state in real time, by analysing what you post on social media?

 

What if you evinced your state of mind as you walk the talk?

 

 

 

The project explores the design of garments that change their form in real time in response to emotions expressed on social media.

THE PROCESS:

 

Step 1 - Identify the emotion the user is feeling based on what they are posting on their social media account, specifically Facebook.

 

Step 2 - Data-mining and text-mining algorithms pick up the relevant dataset (the latest text post), and data-scrubbing algorithms clean it; both were created in the RStudio environment.
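Below is a minimal sketch of the scrubbing step in R. The Facebook retrieval itself (which needs an API token) is omitted; the latest post is assumed to already be available as a string, and the cleaning rules shown are illustrative rather than the exact ones used in the project.

# Illustrative text-scrubbing sketch in R (base functions only).
# The post text is assumed to have been retrieved already.
latest_post <- "Feeling SO happy today!! Finally finished my thesis :) #grateful"

clean_post <- tolower(latest_post)                  # normalise case
clean_post <- gsub("http\\S+", " ", clean_post)     # drop links
clean_post <- gsub("[^a-z ]", " ", clean_post)      # drop punctuation, digits, emoji
clean_post <- gsub("\\s+", " ", trimws(clean_post)) # collapse whitespace

words <- strsplit(clean_post, " ")[[1]]
words
# "feeling" "so" "happy" "today" "finally" "finished" "my" "thesis" "grateful"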

 

Step 3 - I created a sentiment-analysis algorithm based on the Naive Bayes model, for which I built a dictionary of over 14,000 words that associates each word with Robert Plutchik’s 8 primary emotions (acceptance (or trust), joy, surprise, anger, fear, disgust, sadness and anticipation).

 

For each word in the dictionary, the associated primary emotions are marked by assigning a 1 to that primary emotion (primary emotions that are not associated are assigned a 0).

 

The overall valence of the word is also indicated with a 1 or 0 (representing positive or negative respectively).
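As an illustration, here is a toy version of that lookup in R: a miniature lexicon with the 8 emotion columns plus valence, and a scoring step that counts how many matched words flag each emotion. The word list and flags are made up for the example; the real dictionary holds over 14,000 entries.

# Toy emotion lexicon (illustrative only; the real dictionary has 14,000+ words).
lexicon <- data.frame(
  word         = c("happy", "finished", "afraid", "angry"),
  trust        = c(0, 0, 0, 0),
  joy          = c(1, 1, 0, 0),
  surprise     = c(0, 0, 0, 0),
  anger        = c(0, 0, 0, 1),
  fear         = c(0, 0, 1, 0),
  disgust      = c(0, 0, 0, 0),
  sadness      = c(0, 0, 0, 0),
  anticipation = c(0, 1, 0, 0),
  valence      = c(1, 1, 0, 0)   # 1 = positive, 0 = negative
)

# Score the scrubbed post ('words' comes from the scrubbing sketch above):
# sum the emotion flags of every word that appears in the lexicon.
matched        <- lexicon[lexicon$word %in% words, ]
emotion_scores <- colSums(matched[, c("trust", "joy", "surprise", "anger",
                                      "fear", "disgust", "sadness",
                                      "anticipation")])
emotion_scores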

 

Step 4 - After analysing the Facebook post, I am thus able to identify the main emotion the user is feeling.

Various elements on the garment, each mapped to a different emotion, then change their form in response to the emotion that has been identified.
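A short sketch of that selection step, continuing from the scores above: the dominant emotion is simply the highest-scoring column, and (as an assumption about the handoff, since the actual interfacing code is not shown here) it is written out where the Processing sketch can read it before driving the Arduino.

# Pick the dominant emotion from the scores computed above.
main_emotion <- names(which.max(emotion_scores))
main_emotion
# "joy"

# Hand the result to the garment side; writing a small text file that the
# Processing sketch polls is one simple option (an assumption, not
# necessarily the project's actual interfacing mechanism).
writeLines(main_emotion, "current_emotion.txt")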

 

Step 5 - Actuators created using Nitinol (a shape-memory alloy) control the movement of the various elements of the garment based on the input emotions.

 

The garment uses custom-built electronics, an Arduino (micro-controller), the RStudio platform for data mining and sentiment analysis, and the Processing environment for interfacing with the Arduino.

THE FINAL GARMENT

THE CONCEPT VIDEO

THE DESIGN PROCESS

THE APP DESIGN

EXHIBITIONS & TALKS

• This project was the only Indian project selected to be exhibited at the "INTERSECTIONS EXHIBITION" held at Loughborough University London on 13-14 September 2017. The exhibition showcased the work of over 30 collaborations between leading designers, practitioners, artists, artisans, academics, scientists and engineers from around the world.

• I was also a speaker at the "ALL-INDIA SMART TEXTILES CONFERENCE", where I presented this project and spoke about the use of new technologies in the field of smart textiles and the integration of the shape-memory alloy Nitinol into textiles during the weaving process.

 

© 2018 by Paridhi Diwan.