%META:TOPICPARENT{name="SecondEssay"}%
Alexa Isn't Safe for Work for Lawyers, So Why Do Lawyers Still Use Her?

-- By AyeletBentley - 03 Dec 2019
The legal ramifications of Alexa have mostly been discussed in relation to the use of Alexa recordings as evidence in court. This paper, however, takes up a different legal issue: can lawyers ethically have Alexa? After concluding that they cannot, it asks why lawyers still use these devices, and what liability the companies that make them bear for failing to warn.

Many people are momentarily disturbed when they realize the breadth of Alexa's privacy problems, but they decide that asking Alexa what time it is is worth the privacy loss. For lawyers and law firms bound by the Rules of Professional Responsibility, however, this lax view is not an option. Case.one and Thomson Reuters have designed Alexa apps to assist law firms with billing and case lookup respectively, showing that Alexa is not only used by lawyers but used for legal work. This paper begins by explaining how Alexa captures and uses data. It then examines how that data use interacts with the Rules of Professional Responsibility. Finally, it considers why lawyers continue to use these devices and whether the companies that make them can be held responsible for that use.
ALEXA'S PRIVACY CONCERNS
It seems obvious that a listening device that sends conversations to Amazon is off-limits for lawyers. Yet lawyers still use them. When Alexa listens, the clips she records are not destroyed after she responds; they are stored indefinitely, or until someone deletes them. Account holders can go back and listen to every conversation they have ever had with Alexa. A clear problem apparent when doing so is that the recorded conversations are not always preceded by a "wake word," as Amazon claims they are. Further, Alexa automatically records the few seconds preceding the "wake word."
Another serious privacy concern is that Amazon employees listen to these recordings; Amazon itself has admitted as much. Amazon claims that it takes privacy seriously and that only a small fraction of conversations, drawn from a random set of customers, are reviewed. But any conversation could be listened to, and even if a particular conversation never is, Amazon retains the ability to listen. Any expectation of privacy is lost.
PROFESSIONAL RESPONSIBILITY

A lawyer's duty of confidentiality is governed by Rule 1.6, which provides that a lawyer shall not reveal information relating to the representation of a client unless 1) the client gives informed consent, 2) the disclosure is impliedly authorized in order to carry out the representation, or 3) one of a number of exceptions applies, including protecting the client or third parties from death or preventing a crime the client is about to commit. The third category is outside the scope of this analysis because it is case-specific.
Speaking in front of Alexa is likely a disclosure to a third party: humans listen to the recordings. And because Alexa records and sends snippets of conversations that are not preceded by a wake word, a lawyer or law firm cannot solve the problem by simply avoiding the word "Alexa" when discussing client matters. Thus, for lawyers to have Alexa, her use would have to fall within Rule 1.6's allowances for disclosure.

Perhaps lawyers could obtain informed consent by putting Alexa into an information disclosure form that their clients sign. Beyond the face-value absurdity of this, it runs into two problems: 1) the client does not know specifically which matters would be shared with Alexa, and 2) neither the lawyer nor the client knows who will actually be listening. An even more troubling argument would be that the disclosure is impliedly authorized to carry out the representation: one could argue that as long as the client sees Alexa when walking into the law office, there is implied consent. This is a bad argument. It does not cover the home Alexas lawyers use when working from home, it assumes clients know that actual humans are listening, and it opens a slippery slope of assumptions.
WHY DO LAWYERS IGNORE THIS?
If lawyers are aware that Alexa listens, and that this should violate legal ethics, why are they determined not to perceive it? Most often they probably choose to ignore it the same way anyone else aware of the privacy risks does, because they appreciate being able to ask Alexa the weather. Further, the Case.one and Thomson Reuters apps make Alexa seem not merely permitted for lawyers but almost condoned, since those apps are built specifically for legal work.
Perhaps, too, because the ABA has not released anything saying Alexa is prohibited, people assume it is allowed. Most writing on the topic discusses the issues but stops short of declaring the device off-limits, saying only that lawyers should take caution. And since lawyers have lawyer friends who use these devices, there is an assumption that perhaps nothing is wrong with doing so. This is a topic the ABA should speak up on, and one that professional responsibility classes should likely discuss.
ARE AMAZON AND GOOGLE LIABLE?
Finally, could Amazon, Google, and others be held responsible for failing to warn professionals that use of their devices may violate professional ethics? Probably not. A company can be held liable for failure to warn when the product is dangerous, the danger is or should be known to the manufacturer but not to the consumer, and the harm occurs when the product is used in the normal, intended way. The danger of violating professional ethics by using Alexa is different in kind from the physical dangers the duty to warn ordinarily addresses.
Further, manufacturers may not be required to warn sophisticated users. A lawyer, who should herself know the rules of professional ethics, may well be a sophisticated user, though she may not be sophisticated about the technical details of Alexa's privacy practices. That said, Alexa's terms of use do disclose that Alexa records and stores conversations, and if anyone counts as a sophisticated user when it comes to reading terms of use, it should be a lawyer. Perhaps other professionals who are less sophisticated about terms of use, such as doctors, should lead a failure-to-warn action instead. There is a class-action lawsuit against Amazon alleging that Alexa illegally records children's voices, but children who have not consented are differently situated from professionals who choose not to read the terms of use.