Transparency and Control Are Central to Our Approach

August 31st, 2018

As AI assistants that join and aid conversations become more common, we need to set standards for transparency and control that protect all conversation participants. Some people believe there is a fair trade to be made between the privacy of all participants and the utility of some. We don't. We believe privacy and the right to transparency and control are fundamental rights, so we have designed our enterprise AI assistant, Eva, that way.

Over-index on the side of being transparent

In short, Michael Cohen and Omarosa would hate our approach. When Eva is scheduled to join a call, Eva takes several steps (sketched in the code after this list) to make sure all participants are fully aware that the meeting is being recorded and preserved.

  • Eva emails the participants ahead of time, letting them know that Eva will be there to record and take notes.
  • When the participants join the call, Eva announces out loud that Eva is recording and taking notes.
  • When video is enabled for conference calls, Eva displays a visible presence in the video feed.
  • When chat is enabled for conference participants, Eva posts a message about Eva's activities.
  • When the meeting is over, Eva emails the internal attendees letting them know that ALL of them can edit or delete the notes (for free).

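To make the flow concrete, here is a minimal sketch of that notification sequence. Every name in it (Meeting, send_email, announce_in_call, and so on) is an illustrative stand-in, not Eva's actual API; in production these calls would go through email, conferencing, and chat integrations.

```python
from dataclasses import dataclass
from typing import List


def send_email(to: str, body: str) -> None:
    # Stand-in for an email integration.
    print(f"email -> {to}: {body}")


def announce_in_call(message: str) -> None:
    # Stand-in for an in-call audio announcement.
    print(f"announcement: {message}")


@dataclass
class Meeting:
    participants: List[str]          # everyone on the calendar invite
    internal_attendees: List[str]    # attendees from the host's organization
    has_video: bool = False
    has_chat: bool = False


def before_meeting(m: Meeting) -> None:
    # Every participant is told in advance that Eva will record and take notes.
    for person in m.participants:
        send_email(person, "Eva will join this call to record and take notes.")


def on_join(m: Meeting) -> None:
    # Announce in every channel that is enabled for the call.
    announce_in_call("Eva is recording this call and taking notes.")
    if m.has_video:
        print("video: showing Eva's tile in the video feed")
    if m.has_chat:
        print("chat: Eva is recording this call and taking notes.")


def after_meeting(m: Meeting) -> None:
    # Internal attendees learn that any of them can edit or delete the notes.
    for person in m.internal_attendees:
        send_email(person, "Notes are ready; any attendee can edit or delete them.")
```
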
This kind of radical transparency is much better for long-term trust. We frequently get requests to dial back the transparency because some competitors (who shall remain unnamed) consider reduced transparency acceptable. We disagree, because that behavior gives the entire solution space a bad name. Think of how pop-ups in the early 2000s hurt the entire ad ecosystem.

In addition to radical transparency, we also believe in control. Anyone who is a proven attendee of a meeting (i.e. they were on the calendar invite or explicitly added as a collaborator) has the right to control when Eva is recording and even to delete the underlying audio for EVERYONE. This may seem like an extreme position, but from the attendees' perspective we believe it is the best privacy-protection feature: if someone leaves a meeting uncomfortable with the recording, they should be able to delete it. And because we make such deletions transparent, there is a counterbalance; attendees will know that the meeting was deleted and who deleted it.
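
A rough sketch of that control rule, under stated assumptions, looks like the code below: only proven attendees (on the invite or explicitly added as collaborators) may delete the audio, and every deletion is written to a log the other attendees can see. The class and method names here are hypothetical, not Eva's actual implementation.

```python
from dataclasses import dataclass, field
from typing import List, Set


@dataclass
class Recording:
    meeting_id: str
    invited: Set[str]                              # emails on the calendar invite
    collaborators: Set[str] = field(default_factory=set)
    deleted: bool = False
    audit_log: List[str] = field(default_factory=list)

    def is_proven_attendee(self, user: str) -> bool:
        # A "proven attendee" was on the invite or explicitly added as a collaborator.
        return user in self.invited or user in self.collaborators

    def delete_audio(self, user: str) -> None:
        if not self.is_proven_attendee(user):
            raise PermissionError(f"{user} was not an attendee of {self.meeting_id}")
        self.deleted = True
        # The deletion itself stays visible: every attendee can see who deleted it.
        self.audit_log.append(f"audio deleted for everyone by {user}")


rec = Recording("weekly-sync", invited={"ana@example.com", "bo@example.com"})
rec.delete_audio("ana@example.com")
print(rec.deleted, rec.audit_log)   # True ['audio deleted for everyone by ana@example.com']
```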

We are currently working on a number of new features that will give attendees even more control as Eva becomes more ubiquitous in meetings at work.

In summary, the ecosystem would benefit from basic privacy standards for transparency and control around how AI participates in (and records) conversations. We believe that building trust requires a minimum amount of transparency and control. We would love to hear your thoughts on this topic.
