Fashion meets artificial intelligence: is your privacy at risk in virtual fitting rooms?

The retail fashion industry is once again embracing the merits of artificial intelligence – but this time it is bringing the technology to its fitting rooms. Artificial intelligence, along with virtual and augmented reality technology, is being harnessed to meet the unprecedented need for contactless stores created by the ongoing pandemic.

‘Fit technology’ is by no means a new area, and companies have been trying to perfect the art of online fittings for years, mainly with a view to enhancing the e-commerce experience. However, due to safety measures that have been rolled out in most brick-and-mortar stores around the globe, this technology is now being positioned to rapidly replace the fitting room experience that was once the main selling point of in-person stores.

Such technology is not without privacy concerns. Our TechGirl, Charlotte Gerrish (assisted by Komal Shemar), has worked with AI and augmented reality technology companies and wants to share her insights with the TechGirl community on reconciling cutting-edge algorithms with the right to privacy.

The current situation

Whilst most fitting rooms in fashion retail stores are closed due to the risk of creating contamination sites, recent studies show that even if such amenities were available, consumers no longer feel comfortable using them. According to a white paper by First Insight, more than half of all women and men in the US no longer feel safe trying on clothes in fitting rooms, at 65% and 54% respectively, with a staggering 78% of females no longer feeling safe when testing beauty products.

This means that the fashion and make-up retail industry could take a significant hit as consumers will now either refrain from buying items due to the uncertainty surrounding fit, or will buy products to try on in the comfort of their own home, which could add to the rate of returns that these retailers see. According to Vogue Business, fit is said to be the leading reason for fashion e-commerce returns, explaining the accelerated shift towards tech-driven fitting rooms now.

How does the technology work?

For such virtual fit technologies to work, precise measurements and personal preferences are required – which means handing over sensitive data in the hope that that pair of jeans you have been eyeing up all week might fit.

For example, one leading company in this space uses artificial intelligence, with its customers providing data on their height, weight and fit preferences, as well as 3D cameras capturing 150 data points on their body in the space of 10 seconds – all through the medium of an app on their mobile phone. This algorithm then combines this information with its existing database of items, styles and sizes to recommend products from its brand partners.
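The recommendation step described above can be sketched in simplified form. None of the company's actual code, data or field names are public, so the size chart, measurements and nearest-match scoring rule below are purely illustrative assumptions: the real system reportedly uses 150 body data points and a proprietary database of brand-partner sizing.

```python
# Hypothetical sketch: recommend the size whose reference measurements
# are closest to the shopper's scanned body measurements.
# The chart values and the three-measurement model are assumptions.

SIZE_CHART = {
    # size: (chest_cm, waist_cm, hip_cm) -- illustrative reference values
    "S": (88, 74, 94),
    "M": (96, 82, 100),
    "L": (104, 90, 108),
}

def recommend_size(chest: float, waist: float, hip: float) -> str:
    """Return the size with the smallest squared distance to the scan."""
    body = (chest, waist, hip)

    def distance(ref: tuple) -> float:
        return sum((a - b) ** 2 for a, b in zip(body, ref))

    return min(SIZE_CHART, key=lambda size: distance(SIZE_CHART[size]))
```

Even a toy version like this makes the privacy point concrete: the input to the function is a detailed description of someone's body, and it only produces useful output if that data is handed over.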

Another company takes this same data and is planning on combining it with the ability to scan QR codes in retail stores to have personalised recommendations sent to your profile on their app. Similarly, 3D scanning technology is working towards allowing tailors to take measurements without having to come into contact with their client, as well as creating online fitting rooms where you can style clothes on yourself via your mobile or laptop.

What are the privacy concerns?

In using such online fitting rooms, consumers will be handing over their personal data to the providers of this technology, as well as their suppliers and brand partners (the retailers). The significant point here is that such personal data is not limited to usual contact data such as name or email address but will include sensitive biometric data.

Sensitive data is accorded extra protection under the General Data Protection Regulation, Europe’s leading law on the protection of your personal information. As such, companies processing such personal data will require a lawful basis to rely on under Article 6 of the GDPR, which applies to the processing of all personal data, but in addition, they will also require a lawful basis under Article 9 of the GDPR. For such commercial processing of sensitive data, consent will most likely be the appropriate ground to rely on under Article 9.

However, even if you do provide consent to such processing, you need to fully understand what you are consenting to. For example, could that sensitive data be used for e-marketing? Could it be sold and used by AdTech companies? Will it be shared with only one retailer or with a consortium of retailers? Could your data be used in existing trend forecasting AI tools? These questions should all be covered by the data controller’s privacy policy – which should be as transparent and open to accountability as possible. As sizing can differ across brands and retailers, this process might require handing over such data to numerous data controllers and processors.

Furthermore, biometric data in the form of 3D scans of your face and body could also reveal sensitive data such as sex, racial or ethnic origins, and potential or existing health issues. This is significant because data bias is a huge issue with AI and automated technology. A famous example is Amazon's now-discontinued AI recruitment algorithm, which penalised female candidates for technical roles. The bias was built into the algorithm because the system was trained on the company's internal hiring records from the previous decade, in which most applicants were men.

Therefore, the data used to train these algorithms will be important. Arguably, any automated decision-making through these algorithms should not have significant legal or similarly significant effects on data subjects as described in Article 22 of the GDPR: in the worst-case scenario, a data subject will simply be recommended products that do not actually fit. Nonetheless, the processing of sensitive data must be proportionate to the reasons behind the processing, with such companies imposing adequate technical and organisational measures to mitigate any risks posed to the rights of data subjects under the GDPR.

Such measures can include anonymisation of personal data, so that the data subject can no longer be identified, or pseudonymisation, whereby the data cannot be used to identify a data subject without additional information or a key. That key must be kept separately and securely, for example through encryption. However, such measures can be difficult to apply where biometric data is concerned, especially if facial images are being used in an algorithm. Similarly, the retention of such data can be an issue if a data subject retracts their consent to the processing. Data controllers must be able to extract and delete one individual's data from a data set at any given time – which is easier said than done when it comes to machine learning technology.
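To make the pseudonymisation idea above concrete, here is a minimal sketch. The record layout and storage are assumptions for illustration only: in a real deployment the key table would itself be encrypted, access-controlled and held in a separate system, and deletion from trained machine learning models is a far harder problem than deletion from a raw data set.

```python
# Minimal pseudonymisation sketch (illustrative only).
# Identity data and measurement data are linked solely by a random token,
# and the mapping is held separately from the measurements.
import secrets

key_table = {}  # token -> identity; would be stored separately and encrypted
data_set = {}   # token -> non-identifying measurement data

def pseudonymise(name: str, email: str, measurements: dict) -> str:
    """Store identity and measurements under a random token and return it."""
    token = secrets.token_hex(16)
    key_table[token] = {"name": name, "email": email}
    data_set[token] = measurements
    return token

def erase(token: str) -> None:
    """Honour a withdrawal of consent: delete both the identity mapping
    and the individual's measurement record."""
    key_table.pop(token, None)
    data_set.pop(token, None)
```

Without access to the key table, the measurement records cannot be tied back to a person; with it, a controller can locate and erase one individual's data on request.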

Conclusion

If companies rolling out such technology successfully implement privacy by design, remove data bias from their algorithms, as well as implement secure technical and organisational measures, then this could be a massive opportunity in this industry – and could revolutionise the way we experience retail fashion.

However, to get this right, privacy must be at the forefront of all stakeholders’ minds, including the data subjects that will be handing over such sensitive data.

If you have any questions relating to privacy, artificial intelligence, or automated technologies, please do not hesitate to contact us! Likewise, let us know what you think about the future of virtual fitting rooms!
