GDPR (General Data Protection Regulation, Wikipedia) was a big topic last year. I've seen people ask how to make their Alexa Skills and Google Actions GDPR-compliant, but there are no clear answers yet.
Has anyone here built GDPR compliance into their voice apps? What did you do?
GDPR grants users several rights (learn more here). Some of them are easy to address by adding additional intents and logic; others are a little tricky:
- Right to Access Personal Data: Users should be able to get all the data collected about them. I think this can be a challenge because of anonymized user IDs: without account linking, the request could only be “triggered” by a dedicated intent. It’s also difficult to return all saved data in a voice response. Any thoughts?
- Right to Rectification: Users should be able to have their data modified. If the user’s ID is known (see above), this shouldn’t be a problem.
- Right to Erasure: All data should be able to be deleted. Shouldn’t be a problem with a dedicated intent that removes the user’s entry from the database.
- Right to Restrict Data Processing: Users should be able to stop processing of their personal data. I think this can be solved by adding a flag like `isUserDataProcessingAllowed`. This would keep the data in the database, but the app would treat the user as a new user with no stored data.
- Right to Data Portability: A user should be able to request that their personal data is sent to a third party. Shouldn’t be a problem, as user data in voice apps is mostly structured and available in one database.
- Right to Object: Solution -> simply honor any request to stop data processing instead of rejecting it.
- Right to Reject Automated Individual Decision-Making: Similar to the Right to Restrict Data Processing, in my opinion.
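To make the ideas above concrete, here’s a minimal framework-agnostic sketch in TypeScript, with an in-memory `Map` standing in for the database. The function names (`getUserData`, `setProcessingAllowed`, `deleteUserData`) are hypothetical; in a real skill they’d be called from the respective intent handlers, keyed by the app-scoped user ID.

```typescript
// Hypothetical user record: stored data plus the processing flag
// suggested above for the Right to Restrict Data Processing.
type UserRecord = {
  data: Record<string, unknown>;
  isUserDataProcessingAllowed: boolean;
};

// In-memory stand-in for a real database, keyed by app-scoped user ID.
const store = new Map<string, UserRecord>();

// Right to Access / Right to Restrict: if the flag is off, the record
// stays in the database, but the app behaves as if the user were new.
function getUserData(userId: string): Record<string, unknown> {
  const record = store.get(userId);
  if (!record || !record.isUserDataProcessingAllowed) {
    return {}; // treated like a brand-new user
  }
  return record.data;
}

// Toggle the processing flag, creating an empty record if needed.
function setProcessingAllowed(userId: string, allowed: boolean): void {
  const record =
    store.get(userId) ?? { data: {}, isUserDataProcessingAllowed: allowed };
  record.isUserDataProcessingAllowed = allowed;
  store.set(userId, record);
}

// Right to Erasure: a dedicated "delete my data" intent would call this.
function deleteUserData(userId: string): boolean {
  return store.delete(userId);
}
```

Because the data is structured and lives in one place, the Right to Data Portability falls out almost for free: `JSON.stringify(getUserData(userId))` is already a machine-readable export.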
The question here is also: What counts as personal data? As the voice platforms already anonymize the data to app-scoped user IDs, there is not a lot of Amazon/Google account information an app can get (without account linking). Does “this user has used the app 3 times this week” count as personal data?
Interested in hearing your thoughts!