I have a great Fitbit story that really gives you an idea of what the risks are. There was a study done, I believe it was in North Carolina, in a geriatric facility about activity. They gave people something that is pretty much like a Fitbit. It monitored their activity, the number of steps that they took. The study set out to record all this data and see how it impacts patients' health. So they collected all this data and then they started to look at it. And they found some weird patterns. It seems that very late at night, right before bedtime, a lot of patients were having five to ten minutes of really strenuous activity. And they were like, this is strange, we don't know exactly what's happening here. And then they thought about it and realized, no, we've discovered that we're tracking something that we don't want to track.

So we collect all this data and we don't exactly know what the implications are, what box we're opening when we get all that data. We go into it thinking, hey, I can figure out things that are really powerful and really helpful. But we also don't know what else we're going to find in that data. Another example is that if you publish your Fitbit activity and you go to the gym every day at 8 AM, and let's say I am an evil person, I know that you are not home at 8 AM because you go to the gym every day. That provides a criminal opportunity someone might take advantage of, one that a person who doesn't use a Fitbit wouldn't create.

So we don't think about these unintentional things, we don't think about the unintentional data gathering. Even though we're obfuscating the data and we don't know which patients they are, we can find out a lot of information that isn't necessarily good or bad, but can compromise privacy in different ways.