It’s 4:00 p.m. on a Friday. You get a text message from a colleague saying they can’t log into their email. Then others start texting you with similar messages. Your IT person reports that there’s unusual activity on your computer network and that they’re having trouble remoting into the system. An hour or so later, you get the weekend-ruining news: your systems have been hacked, and the hackers are demanding a ransom in Bitcoin to get your files back.

What you do next largely depends on the nature of your business, whether you have recent backups of your critical files, and whether you have cybersecurity insurance (which almost all businesses should). The specifics of a proper ransomware response are outside of the scope of this article and will vary widely depending on the circumstances of each attack. Here, we focus on the single critical question: should you (or your insurer) pay the hackers the ransom they are demanding?

Unsurprisingly, the answer is, “It depends.” Most likely, your business is not in a financial position to pay the six- or seven-figure sums demanded by the hackers, so you will rely on your insurer for advice and guidance. Naturally, you would probably be inclined to tell your insurer to “do whatever it takes” to get access to your systems back. But it’s not that simple, especially in light of recent guidance from the United States Department of the Treasury.

That guidance warns businesses that have been victimized by ransomware attacks to carefully consider whether they or their insurers should pay ransom to hackers. Putting aside the obvious ethical issues associated with continuing to fund bad actors who do bad things, the government wants businesses and insurers to know that directly or indirectly facilitating payments to hackers may violate federal law and regulations if it turns out that the ransomware payments were made to groups or individuals on the government’s “Specially Designated Nationals and Blocked Persons List.” In other words, if the government subsequently learns that you or your insurer made a payment to a person or entity on this list, you may face legal consequences even if you did not know that the recipient was on the list. (Helpfully, however, the government notes that if you promptly consulted with law enforcement before making any ransomware payment, it will consider that consultation a mitigating factor in your favor.)

Does this mean that your insurer should never pay ransom? No, because, again, the answer depends on the facts and circumstances of each ransomware attack. If a business believes it may have experienced or is experiencing a ransomware attack, it should promptly contact its insurer, a qualified cybersecurity expert, competent legal counsel, and perhaps law enforcement to determine the best path forward.

In any cybersecurity incident, a business is often faced with a number of bad options. The key is to pick the approach that allows the business to continue to operate while minimizing potential legal fallout as much as possible. This may or may not include paying the ransom demanded. If you have questions and would like further information or a review of your organization’s cybersecurity policy, please contact PLDO Partner Brian J. Lamoureux at 401-824-5155 or email bjl@pldolaw.com. Attorney Lamoureux is a member of the firm’s litigation, employment, and cybersecurity teams.

Disclaimer: This blog post is for informational purposes only. This blog is not legal advice and you should not use or rely on it as such. By reading this blog or our website, no attorney-client relationship is created. We do not provide legal advice to anyone except clients of the firm who have formally engaged us in writing to do so. This blog post may be considered attorney advertising in certain jurisdictions. The jurisdictions in which we practice license lawyers in the general practice of law, but do not license or certify any lawyer as an expert or specialist in any field of practice.

America is grappling with a compelling question during the coronavirus pandemic: should we allow Big Tech and the government to use our smartphones to track our movements, body temperature, and activities to slow the infection rate?

As explained in the wonderful documentary Terms and Conditions May Apply, history provides some guidance here. Twenty years ago, a little-known web-based toy company called Toysmart.com filed for bankruptcy. Toysmart proposed to sell its customer list as part of the bankruptcy process. The request raised serious privacy concerns, because Toysmart had promised its customers that it would never sell or disclose their information. Understandably, however, Toysmart’s creditors viewed the customer list as a valuable asset that should be sold to satisfy their claims.

After much public outcry (including an enforcement action by the Federal Trade Commission against Toysmart), Toysmart was allowed to sell its customer list. This resulted in tremendous outrage among privacy advocates, and spurred Congress into action. Congress immediately proposed a comprehensive online privacy bill called “The Online Privacy Protection Act of 2001.” This bill would have made it illegal for a website to “collect, use, or disclose personal information concerning” individuals in violation of rules to be set by the Federal Trade Commission. It would have also required websites to keep personal information confidential and secure.

But before this bill could pass, something tragic happened: 9/11. Congress quickly realized that having private companies such as websites gather large amounts of information about their users might be a critically important tool in the newly launched fight against terrorism. Unsurprisingly, the privacy bill went nowhere; instead, Congress passed the Patriot Act, which expanded the government’s ability to obtain data in the war against terrorism.

This policy shift had three enormous impacts. First, it paved the way for social media and tech companies to create platforms based on gathering, using, and selling consumer information virtually free of regulation. Had the 2001 privacy act passed, those business models might have been unworkable due to the forthcoming FTC restrictions. Would Facebook have even been able to exist in its current form with such strong privacy regulation? Likely not.

Second, Americans showed that they were mostly comfortable trading certain liberties for safety, or at least the appearance of it. Desperate times understandably called for desperate measures. However, as the last twenty years have shown, it was perhaps not such a great idea, from a privacy perspective, to give Big Tech and other data-driven companies a virtually unrestrained ability to gather, manipulate, use, sell, and monetize our personal data.

Third, we now see some serious consequences of this policy shift. It’s one thing to allow Big Tech and the government to access the movies we like, the sports teams we follow, and the friends we have on Facebook. But now that the coronavirus pandemic is here, public health advocates want to use Big Tech’s tools to access our current body temperature, our movements, the places we’ve visited, and the people we sat next to on the subway or at Subway. Sure, these advocates correctly note that people could opt out of these tools, but how long before we see signs in restaurants saying “Green Status Customers Only”?

The pandemic requires a coordinated and strong response by government and society. Whether this response should include a wholesale trading away of the privacy rights we held dear before the pandemic (over our movements, health status, contacts, body temperature, and so on) is an open and serious question. We should pause and think before granting Big Tech and the government access to these highly personal data points. It is too early to tell whether granting this level of access to our personal data will be effective, or what long-term impact it will have on our privacy rights. One thing is certain, however: once this genie is out of the bottle, there is no going back. If you would like further information, please contact PLDO Partner Brian J. Lamoureux at 401-824-5155 or email bjl@pldolaw.com.

[This article was published in the Providence Journal, May 18, 2020 print edition, and can be accessed by clicking https://www.providencejournal.com/news/20200517/my-turn-brian-j-lamoureux-how-will-covid-19-tracking-apps-affect-privacy.]

Disclaimer: This blog post is for informational purposes only. This blog is not legal advice and you should not use or rely on it as such. By reading this blog or our website, no attorney-client relationship is created. We do not provide legal advice to anyone except clients of the firm who have formally engaged us in writing to do so. This blog post may be considered attorney advertising in certain jurisdictions. The jurisdictions in which we practice license lawyers in the general practice of law, but do not license or certify any lawyer as an expert or specialist in any field of practice.

Cybersecurity researchers recently created malware that was able to trick three radiologists into thinking their patients had cancer. The researchers were also able to fool automated screening systems by altering images and scans to place fake “tumors” on them, or by removing real tumors from scans that showed advanced disease. These images and scans were vulnerable because they were not digitally signed or encrypted. The researchers also found that while hospitals have done a good job of preventing confidential patient information from leaving their facilities, they have been far less careful in how data is handled internally. This has left hospitals and other medical facilities vulnerable to intrusions by malware and other viruses that could wreak havoc on patients’ sensitive medical data.
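To make the missing safeguard concrete, here is a minimal sketch, in Python, of the kind of integrity check the researchers found lacking: recording a keyed digest of an image file when it is acquired and verifying it before the scan is read. The file name and key handling below are hypothetical placeholders; a real imaging deployment would rely on the imaging standard’s own digital-signature support and proper key management rather than anything this simple.

```python
import hmac
import hashlib
from pathlib import Path

# Hypothetical shared secret; a real deployment would keep this in a key
# management system, not in source code.
SECRET_KEY = b"replace-with-a-managed-key"

def sign_image(path: str) -> str:
    """Return a keyed SHA-256 digest (HMAC) of the image file's bytes."""
    data = Path(path).read_bytes()
    return hmac.new(SECRET_KEY, data, hashlib.sha256).hexdigest()

def verify_image(path: str, expected_digest: str) -> bool:
    """True only if the file is byte-for-byte unchanged since it was signed."""
    return hmac.compare_digest(sign_image(path), expected_digest)

# Hypothetical usage: record the digest when the scan is acquired,
# then verify it before the scan is interpreted.
# digest = sign_image("ct_scan_0001.dcm")
# assert verify_image("ct_scan_0001.dcm", digest), "scan may have been altered"
```

The point is not the specific code but the principle: if every scan carried a verifiable signature, silently inserting or removing a tumor would break the check.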

Although it’s too early to tell what impact these vulnerabilities will have on patient care, they should alarm all of us on several levels. First, interpreting radiological scans requires a high level of skill and judgment in what is already a complex and nuanced process. If a patient or radiologist cannot trust the integrity of the data underlying a particular scan or image, then any conclusions based on it will be viewed with some level of skepticism. Second, as more people learn that malware can make it look like a patient has a tumor or cancer when he or she is perfectly healthy, we should expect a sharp increase in the number of patients seeking second opinions or additional scans or tests because they do not trust the initial scan or its interpretation, perhaps out of an understandable sense of fear or denial. This will place additional burdens and costs on an already over-burdened and expensive health care system.

Finally, we should expect to see more and more hospitals threatened by ransomware and cybercriminals who demand payment in exchange for not corrupting the hospital’s radiological system. Given the stakes in play – life, death, and potentially unnecessary treatment – hospitals and other medical providers would be well-advised to plan to prevent and address these new cyberthreats in the medical landscape. For further information, please contact Attorney Brian J. Lamoureux at bjl@pldolaw.com or 401-824-5155.

Disclaimer: This blog post is for informational purposes only. This blog is not legal advice and you should not use or rely on it as such. By reading this blog or our website, no attorney-client relationship is created. We do not provide legal advice to anyone except clients of the firm who have formally engaged us in writing to do so. This blog post may be considered attorney advertising in certain jurisdictions. The jurisdictions in which we practice license lawyers in the general practice of law, but do not license or certify any lawyer as an expert or specialist in any field of practice.

Ten years ago, the mention of a cybersecurity breach was a rarity. Today, reports of breach incidents are almost commonplace. Every breach has the potential to cause significant harm and financial loss. In addition, when the victim of a breach is a business, the owner is often held liable, especially if it is determined that the business owner did not take appropriate preventive steps to protect customer data. At least three high-profile cybersecurity breaches in recent memory offer lessons for those struggling with cybersecurity issues.

In 2017 we learned of the Equifax breach, which stands out for several reasons:

  • First—while not the largest breach in history, it exposed the Social Security numbers of more Americans than any other breach, compromising data for potentially half the U.S. population.
  • Second—the breach struck a credit reporting agency whose core business is safeguarding consumers’ personal information.
  • Third—the news continues to get worse. The latest reports from Business Insider indicate that the number of reported victims has risen from 143 million to 146.6 million, and that more than 56,000 victims had especially sensitive documents leaked, such as driver’s licenses and passport information.

From a personal standpoint, the Equifax leak reminded all Americans of the importance of protecting credit information, prompting millions to freeze their credit. From a business standpoint, it reminds us that no company doing business online is completely immune to hacking, and that businesses must be ever diligent in monitoring and guarding the personal information of others.

Another massive breach offering “teaching moments” occurred in 2016 at Uber, the world’s dominant on-demand ridesharing service. Hackers broke into Uber’s servers and stole the personal information of 57 million users and 600,000 drivers. To make matters worse, the company attempted to cover up the breach by paying the hackers $100,000 in ransom, a cover-up later reported by The New York Times. The hackers got in by lifting access credentials from the company’s GitHub account, a code-development platform that should never have been used to store the keys to people’s personal information, and then used those credentials to reach the data. Uber did not admit the breach for almost a year.

The Uber fiasco serves as a case study in what not to do for any business entrusted with personal information. First, keep the information (and the credentials that protect it) in a safe place; second, if the data is compromised, don’t try to cover it up.

Easily one of the largest data breach events in history, if not the largest, is the Yahoo incident, which actually involved two separate hacks, in 2013 and 2014, by different attackers (reportedly state-sponsored). Yahoo did not admit the incidents for several years. The initial report said 500 million users had been affected by the 2014 hack—already a record for its time. Later, the company revealed that an earlier breach had compromised the information of 1 billion users. By October 2017, Yahoo admitted the first breach had affected its entire user base—more than 3 billion accounts.

Since that time, Yahoo’s value has dropped considerably; once valued at more than $100 billion, its core internet business was sold to Verizon for just under $4.5 billion. In April 2018, according to The Verge, the SEC fined Yahoo $35 million for failing to disclose the breach. Not only can a cybersecurity breach harm your customers—if you don’t manage the breach correctly, it can do serious damage to the value of your company as well.

Disclaimer: This blog post is for informational purposes only. This blog is not legal advice and you should not use or rely on it as such. By reading this blog or our website, no attorney-client relationship is created. We do not provide legal advice to anyone except clients of the firm who have formally engaged us in writing to do so. This blog post may be considered attorney advertising in certain jurisdictions. The jurisdictions in which we practice license lawyers in the general practice of law, but do not license or certify any lawyer as an expert or specialist in any field of practice.

Introducing the little-known (but hugely important) General Data Protection Regulation (GDPR). We know, we know. You’ve never heard of it. And if you have heard of it, you don’t think it applies to your company, your college, or your work as a consultant. Well, sorry to say, you’re probably wrong. GDPR likely applies to any business with a computer connected to the Internet.

What is GDPR, you ask? GDPR is a new regulation adopted by the European Union (EU) that takes effect on May 25, 2018. It is a game-changer. GDPR is the EU’s new law to protect personal data and enhance the privacy of its citizens online. Among other things, GDPR:

  • empowers its citizens to have more control and say over how and when their personal information is collected, used, analyzed, and stored.
  • limits the processing of personal data to specified, legitimate business purposes.
  • gives EU citizens the right to obtain any data you have gathered about them and to require you to fix any inaccuracies or even erase their data (known as the “right to be forgotten”).
  • requires you to use adequate security measures to protect this data and to promptly notify users of any breaches (i.e., within 72 hours in most cases).

GDPR will be enforced by national regulators in the EU, who will have the authority to assess mind-bogglingly substantial fines for violations: the greater of 4% of a company’s annual worldwide revenue or €20 million (about $24,137,900 as of the writing of this post). Yes, you read that correctly.
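To put that greater-of formula in concrete terms, here is a minimal sketch in Python using purely hypothetical revenue figures (not any actual company’s numbers):

```python
# Top-tier GDPR fines are capped at the greater of 4% of annual worldwide
# revenue or a flat EUR 20 million.
MAX_FLAT_FINE_EUR = 20_000_000
REVENUE_SHARE = 0.04

def max_gdpr_fine(annual_revenue_eur: float) -> float:
    """Upper bound of the top-tier GDPR fine for a given annual revenue."""
    return max(REVENUE_SHARE * annual_revenue_eur, MAX_FLAT_FINE_EUR)

# Hypothetical examples: a EUR 100 million company is exposed to the flat
# EUR 20 million cap, while a EUR 2 billion company faces up to EUR 80 million.
print(max_gdpr_fine(100_000_000))    # EUR 20 million
print(max_gdpr_fine(2_000_000_000))  # EUR 80 million
```

In other words, for smaller companies the flat €20 million figure controls, while for larger companies the 4% revenue share quickly dwarfs it.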

Do we have your attention yet?

“But wait a second,” you say. “How can this law apply to me when I’m in the United States? This is America! No taxation without representation!” Yes, we know. However, GDPR’s reach is far and wide. If you obtain or use personal data (which the GDPR defines broadly) of any EU citizen, then GDPR likely applies to you. Although it remains to be seen how the EU’s regulatory authorities will seek to enforce the GDPR against American companies, given the close working and trade relationship between the United States and the EU, we expect that international law will permit GDPR enforcers to reach U.S. companies. And, our understanding is that there will be no grace period and enforcement will begin promptly after the May 25, 2018 effective date.

So, now what?

It’s too complicated to lay out all the steps for GDPR compliance in a blog post. Therefore, please treat the following as a very general, high-level set of guidelines to consider:

  • First, a company’s ownership or senior management should be made aware of GDPR’s existence and upcoming implementation.
  • Second, a company should assemble a GDPR Task Force consisting of senior management, legal counsel (either outside counsel or in-house), the company’s IT professionals, the personnel responsible for the company’s website and data gathering/tracking, and perhaps the company’s marketing representative. The Task Force’s purpose would be to begin to understand GDPR and to think through how it affects the company’s data practices.
  • Third, a company may need the services of a cybersecurity consultant to audit or assess its systems and get a solid handle on the data the company tracks and stores.
  • Finally, a company should ask its insurance broker about the possibility of purchasing coverage for GDPR events and enforcement actions. As of this writing, the insurance market appears to be in a state of flux and uncertainty on this issue, most likely because of the staggering amount of financial exposure under the GDPR. We seriously question whether current cybersecurity insurance riders will be broad enough to cover GDPR-related losses and fines.

Based upon our discussions and review of the marketplace, we believe only a small percentage of U.S. institutions are prepared for GDPR’s implementation and potential impacts. GDPR is here to stay and no one wants to be the first U.S. test case caught in the GDPR’s net. We know this all sounds daunting, but you should resist the temptation to ignore GDPR and hope you never get a certified letter from Berlin, Paris, or London. Compliance with GDPR is not impossible, but time is running out.

Disclaimer: All blog posts are for informational purposes only. This blog is not legal advice and you should not use or rely on it as such. By reading this blog or our website, no attorney-client relationship is created. We do not provide legal advice to anyone except clients of the firm who have formally engaged us in writing to do so. This blog post may be considered attorney advertising in certain jurisdictions. The jurisdictions in which we practice license lawyers in the general practice of law, but do not license or certify any lawyer as an expert or specialist in any field of practice.