Privacy

Digital Tracking: Same Concept, Different Era

Meibo Chen, MJLST Staffer

The term “paper trail” grows more anachronistic with every passing year. While some people still prefer old-fashioned pen and paper, our modern world has endowed us with technologies like computers and smartphones. Whether we like it or not, this digital explosion is steadily taking over the life of the average American (73% of US adults own a desktop or laptop computer, and 68% own a smartphone).

These new technologies have forced us to confront novel legal issues arising from their integration into our daily lives. In Riley v. California (2014), the Supreme Court pointed to the immense data storage capacity of a modern cell phone and held that a warrant is required to search one in the context of a criminal prosecution. In the civil context, many consumers are concerned with internet tracking. Indeed, the MJLST published an article in 2012 addressing this issue.

We have grown accustomed to seeing “suggestions” that eerily match our respective interests. In fact, internet tracking technology has become far more sophisticated than the traditional cookie: it can now utilize “fingerprinting” techniques that examine signals like battery status or window size to identify a user’s presence or interests. This leads many to fear for their data privacy. However, isn’t this digital tracking just the modern adaptation of the “physical” tracking we have long been accustomed to?
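To make the “fingerprinting” idea concrete, here is a minimal sketch, in TypeScript, of how a tracking script might fold a few browser signals (the window size and battery status mentioned above, plus the user agent and time zone) into a single identifier. The particular signal set and the toy hash are illustrative assumptions, not any specific tracker’s method; real fingerprinting scripts combine many more signals.

```typescript
// Minimal fingerprinting sketch: combine a few weakly identifying browser
// signals into one short identifier. Illustrative only.
async function collectFingerprint(): Promise<string> {
  const signals: string[] = [
    `${window.screen.width}x${window.screen.height}`,            // screen size
    `${window.innerWidth}x${window.innerHeight}`,                // window size
    Intl.DateTimeFormat().resolvedOptions().timeZone ?? "no-tz", // time zone
    navigator.userAgent,                                         // browser/OS string
  ];

  // The Battery Status API is not available in every browser, so guard it.
  const nav = navigator as Navigator & {
    getBattery?: () => Promise<{ level: number }>;
  };
  if (nav.getBattery) {
    const battery = await nav.getBattery();
    signals.push(`battery:${battery.level}`);
  }

  // Collapse the signals into one short hex identifier with a toy hash.
  let hash = 0;
  for (const ch of signals.join("|")) {
    hash = (hash * 31 + ch.charCodeAt(0)) | 0;
  }
  return (hash >>> 0).toString(16);
}

collectFingerprint().then((fp) => console.log("fingerprint:", fp));
```

Notably, none of these signals depends on a cookie, so clearing cookies does not by itself defeat this kind of identification.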

When we physically go to a grocery store, don’t we subject ourselves to the prying eyes of those around us? Why should it be any different in cyberspace? While sometimes scarily accurate, “suggestions” or “recommended pages” based on one’s browsing history can actually benefit both the tracked and the tracker: the tracked gets more personalized results, while the tracker uses that information to do better business with the consumer. Many browsers already sport an “incognito” mode to disable such tracking, striking a balance for the moments when consumers want their privacy. Of course, tracking technology can be misused, but malicious use of beneficial technology is nothing new.


6th Circuit Aligns With 7th Circuit on Data Breach Standing Issue

John Biglow, MJLST Managing Editor

To bring a suit in any judicial court in the United States, an individual or group of individuals must satisfy Article III’s standing requirement. As recently clarified by the Supreme Court in Spokeo, Inc. v. Robins, 136 S. Ct. 1540 (2016), to meet this requirement a “plaintiff must have (1) suffered an injury in fact, (2) that is fairly traceable to the challenged conduct of the defendant, and (3) that is likely to be redressed by a favorable judicial decision.” Id. at 1547. When data breach cases have reached the federal courts of appeals, there has been some disagreement as to whether the risk of future harm from a breach, and the costs spent to prevent that harm, qualify as “injuries in fact” under Article III’s first prong.

Last spring, I wrote a note concerning Article III standing in data breach litigation, highlighting the circuit split on the issue and arguing that the reasoning of the 7th Circuit in Remijas v. Neiman Marcus Group, LLC, 794 F.3d 688 (7th Cir. 2015) was superior to that of its sister courts and made for better law. In Remijas, the plaintiffs were a class of individuals whose credit and debit card information had been stolen when Neiman Marcus Group, LLC experienced a data breach. A portion of the class had not yet experienced any fraudulent charges on their accounts and asserted Article III standing based upon the risk of future harm and the time and money spent mitigating that risk. In holding that these plaintiffs had satisfied Article III’s injury-in-fact requirement, the court made a critical inference: when a hacker steals a consumer’s private information, “[p]resumably, the purpose of the hack is, sooner or later, to make fraudulent charges or assume [the] consumers’ identit[y].” Id. at 693.

This inference stands in stark contrast to the reasoning of the 3rd Circuit in Reilly v. Ceridian Corp., 664 F.3d 38 (3d Cir. 2011). The facts of Reilly were similar to those of Remijas, except that in Reilly, Ceridian Corp., the company that had experienced the data breach, stated only that its firewall had been breached and that its customers’ information may have been stolen. In my note, mentioned supra, I argued that this difference in facts was not enough to wholly distinguish the two cases and overcome a circuit split, in part because of the Reilly court’s characterization of the risk of future harm. The Reilly court found the risk of misuse of information highly attenuated, reasoning that whether the plaintiffs experienced an injury depended on a series of “if’s,” including “if the hacker read, copied, and understood the hacked information, and if the hacker attempts to use the information, and if he does so successfully.” Id. at 43 (emphasis in original).

Often in the law, we are faced with an imperfect or incomplete set of facts; any time an individual’s intent is at issue in a case, this is a certainty. When faced with these situations, lawyers have long utilized inferences to differentiate between more and less likely accounts of the missing facts. In the case of a data breach, both parties will almost always have little to no knowledge of the intent, capabilities, or plans of the hacker. However, it seems to me that there is room for reasonable inferences about these facts. When a hacker is sophisticated enough to breach a company’s defenses and access data, it makes sense to assume the hacker is sophisticated enough to utilize that data. Further, because executing a data breach is illegal and therefore risky, it makes sense to assume that the hacker seeks to gain from the act. Thus, as between the Reilly and Remijas courts’ characterizations of the likelihood of misuse of data, the better rule seemed to me to be to assume that the hacker is able to utilize the data and plans to do so in the future. Moreover, if there are facts tending to show that this inference is wrong, it is much more likely at the pleading stage that the defendant corporation, rather than the plaintiff(s), will be in possession of that information.

Since Remijas, two data breach cases have reached the federal courts of appeals on the issue of Article III standing. In Lewert v. P.F. Chang’s China Bistro, Inc., 819 F.3d 963, 965 (7th Cir. 2016), the court unsurprisingly followed its recent precedent in Remijas and found that Article III standing was properly alleged. In Galaria v. Nationwide Mut. Ins. Co., a recent 6th Circuit case, the court had to make an Article III ruling without the constraint of an earlier ruling in its circuit, leaving it free to choose what rule and reasoning to apply. Galaria v. Nationwide Mut. Ins. Co., No. 15-3386, 2016 WL 4728027 (6th Cir. Sept. 12, 2016). In the case, the plaintiffs alleged, among other claims, negligence and bailment; these claims were dismissed by the district court for lack of Article III standing. In alleging that they had suffered an injury in fact, the plaintiffs alleged “a substantial risk of harm, coupled with reasonably incurred mitigation costs.” Id. at *3. In holding that this was sufficient to establish Article III standing at the pleading stage, the Galaria court found the inference made by the Remijas court persuasive, stating that “[w]here a data breach targets personal information, a reasonable inference can be drawn that the hackers will use the victims’ data for the fraudulent purposes alleged in Plaintiffs’ complaints.” Moving forward, it will be intriguing to watch how circuits that have not yet faced this issue rule on it, and whether, if the 3rd Circuit keeps its current reasoning, the question eventually makes its way to the Supreme Court of the United States.


A Comment on the Note “Best Practices for Establishing Georgia’s Alzheimer’s Disease Registry” from Volume 17, Issue 1

Jing Han, MJLST Staffer

Alzheimer’s disease (AD), also known simply as Alzheimer’s, accounts for 60% to 70% of cases of dementia. It is a chronic neurodegenerative disease that usually starts slowly and worsens over time. The cause of Alzheimer’s disease is poorly understood. No treatment stops or reverses its progression, though some may temporarily improve symptoms. Affected people increasingly rely on others for assistance, often placing a burden on the caregiver; the pressures can include social, psychological, physical, and economic elements. The disease was first described by, and later named after, German psychiatrist and pathologist Alois Alzheimer in 1906. In 2015, there were approximately 48 million people worldwide with AD, and in developed countries it is one of the most financially costly diseases. Before many states, including Georgia and South Carolina, passed legislation establishing registries, many private institutions across the country had already made tremendous efforts to establish their own Alzheimer’s disease registries. The country has experienced an exponential increase in the number of people diagnosed with Alzheimer’s disease, and more and more states have begun to establish their own registries.

As the Note explains, the Registry in Georgia has emphasized from the outset the importance of protecting the confidentiality of patient data from secondary uses. The Note explores many legal and ethical issues raised by the Registry. An Alzheimer’s patient’s diagnosis history, medication history, and personal lifestyle are generally confidential information, known only to the physician and the patient himself. Reporting such information to the Registry, however, may lead to wider disclosure of what was previously private information and consequently may raise constitutional concerns. While the vast majority of public health registries have historically focused on the collection of infectious disease data, registries for non-infectious diseases, such as Alzheimer’s disease, diabetes, and cancer, have been created more recently. There is a delicate balance between the public interest and personal privacy. Registration is not mandatory because Alzheimer’s is not infectious. After all, people suffering from Alzheimer’s often face violations of their human rights, abuse and neglect, and widespread discrimination. When a patient is diagnosed with AD, the healthcare provider should encourage, rather than compel, the patient to register. Keeping all patient information confidential, enacting procedural rules governing use of the information, and providing some incentives are good ways to encourage more patients to join the Registry.

Attentive to the privacy concerns under federal and state law, the Note recommends slightly broader data sharing with the Georgia Registry: for example, with a physician or other health care provider for the purpose of a medical evaluation or treatment of the individual, or with any individual or entity that provides the Registry with an order from a court of competent jurisdiction ordering the disclosure of confidential information. The Note also discusses the procedural rules designed to administer the Registry in Georgia. Those rules address who the end-users of the Registry are; what type of information should be collected; how and from whom it should be collected; how it should be shared or disclosed for policy planning and research purposes; and how legal representatives obtain authority from patients.

Through one state’s experience, the Note gives us a deeper understanding of Alzheimer’s disease registries across the country. The registry process raises many legal and moral issues. The Note compares the Georgia Registry with those of other states and points out the importance of protecting the confidentiality of patient data. Emphasizing the protection of personal privacy could encourage more people, and more states, to get involved.


Requiring Backdoors into Encrypted Cellphones

Steven Groschen, MJLST Managing Editor

The New York State Senate is considering a bill that requires manufacturers and operating system designers to create backdoors into encrypted cellphones. Under the current draft, failure to comply with the law would result in a $2,500 fine, per offending device. This bill highlights the larger national debate concerning privacy rights and encryption.

In November of 2015, the Manhattan District Attorney’s Office (MDAO) published a report advocating for a federal statute requiring backdoors into encrypted devices. One of the MDAO’s primary reasons for supporting such a statute is the lack of alternatives available to law enforcement for accessing encrypted devices. The MDAO notes that traditional investigative techniques have largely been ineffective. Additionally, the MDAO argues that certain types of data residing on encrypted devices often cannot be found elsewhere, such as on a cloud service. Naturally, the inaccessibility of this data is a significant hindrance to law enforcement. The report offers an excellent summary of the law enforcement perspective; however, as with all debates, there is another perspective.

The American Civil Liberties Union (ACLU) has stated that it opposes using warrants to force device manufacturers to unlock their customers’ encrypted devices. A recent ACLU blog post presented arguments against this practice. First, the ACLU argued that the government should not require “extraordinary assistance from a third party that does not actually possess the information.” The ACLU perceives these warrants as conscripting Apple (and other manufacturers) to conduct surveillance on behalf of the government. Second, the ACLU argued that using search warrants bypasses a “vigorous public debate” regarding the appropriateness of the government having backdoors into cellphones. Presumably, the ACLU is less opposed to laws such as the one proposed in the New York Senate, because that process involves open public debate rather than warrants.

Irrespective of whether the New York Senate bill passes, the debate over government access to its citizens’ encrypted devices is sure to continue. Citizens will have to balance public safety considerations against individual privacy rights—a tradeoff as old as government itself.


Digital Millennium Copyright Act Exemptions Announced

Zach Berger, MJLST Staffer

The Digital Millennium Copyright Act (DMCA), first enacted in 1998, prevents owners of digital devices from making use of those devices in any way the copyright holder does not explicitly permit. Codified in part in 17 U.S.C. § 1201, the DMCA makes it illegal to circumvent digital security measures that prevent unauthorized access to copyrighted works such as movies, video games, and computer programs. The law prevents users from breaking what are known as access controls, even if the purpose would fall under lawful fair use. According to Kit Walsh, a staff attorney at the Electronic Frontier Foundation (a nonprofit digital rights organization), “This ‘access control’ rule is supposed to protect against unlawful copying. But as we’ve seen in the recent Volkswagen scandal . . . it can be used instead to hide wrongdoing hidden in computer code.” Essentially, everything not explicitly permitted is forbidden.

However, these restrictions are not ironclad. Every three years, users may request exemptions for lawful fair uses from the Library of Congress (LOC), but exemptions are not easy to receive. Activists must not only propose new exemptions but also plead for already-granted exemptions to be continued. The system is flawed, as users often need a way to circumvent controls on their devices to make full use of the products. The LOC, however, has recently released its new list of exemptions, and the expanded list represents a small victory for digital rights activists.

The exemptions granted will go into effect in 2016 and cover 22 types of uses affecting movies, e-books, smart phones, tablets, video games, and even cars. Some highlights of the exemptions are as follows:

  • Movies, where circumvention is used to make use of short portions of the motion pictures:
    • For educational uses by university and grade school instructors and students
    • For e-books offering film analysis
    • For uses in noncommercial videos
  • Smart devices
    • Users can “jailbreak” these devices to allow them to interoperate with or remove software applications, and phones can be unlocked from their carriers
    • Such devices include smart phones, televisions, and tablets or other mobile computing devices
      • In 2012, jailbreaking smartphones was allowed, but not tablets; this distinction has been removed
  • Video games
    • Fan-operated online servers are now allowed to support video games once the publishers shut down the official servers
      • However, this only applies to games that would be made nearly unplayable without the servers
    • Museums, libraries, and archives can go a step further by jailbreaking games as needed to get them functioning properly again
  • Computer programs that operate devices designed primarily for use by individual consumers, for purposes of diagnosis, repair, and modification
    • This includes voting machines, automobiles, and implanted medical devices
  • Computer programs that control automobiles, for purposes of diagnosis, repair, and modification of the vehicle

These new exemptions are a small but significant victory for consumers under the DMCA. The ability to analyze your automobile’s software is especially relevant in the wake of the aforementioned Volkswagen emissions scandal. However, the exemptions are subject to some important caveats. For example, only video games that are almost completely unplayable may have user-made servers; for games where only an online multiplayer feature is lost, such servers are not allowed. A better long-term solution is clearly needed, as this burdensome process is flawed and has led to what the EFF has called “unintended consequences.” Regardless, as long as we still have this draconian law, exemptions will be welcome. To read the final rule, the Register’s recommendation, and the introduction (which provides a general overview), click here.


“Drone Wars”: The Battle for Midwestern Skies

Travis Waller, MJLST Staffer

With the new Star Wars film, The Force Awakens, coming this December, a discussion of recent drone-regulation policy seemed like a worthwhile addition to this week’s blog.

While the robotic “drones” of our day and age are certainly not cut from the same titanium alloy as the quasi-humanoid “droid” characters in many of George Lucas’s films, North Dakota may well be on its way to starting its own “robotic army” of sorts.

A friend and colleague from the University of Connecticut School of Law brought to my attention an article by Ben Woods discussing a 2015 North Dakota House bill that proposes arming drones with “non-lethal weaponry” for police functions. With the shocking number of police deaths reported in this country last year, North Dakota may well be leading the way in finding an innovative alternative to placing human officers in potentially dangerous confrontations. However, this benefit does not come without a cost. As presented in a segment by Ashley Maas of the NY Times, drone regulation is still up in the air (excuse the pun). Only within the last year has the FAA determined that it can take action against civilian violators of drone regulations.

Moreover, with recent reports involving the hacking of automated vehicles, as well as Maas’ examples of civilians using drone technology for less than constructive purposes, placing dangerous technology on these machines may well develop into a major public policy concern.

While it is this author’s humble opinion that a fair amount of time remains before we, as a people, need be concerned with an Invasion of Naboo type situation, this may be exactly the kind of situation where more time is needed to allow the security measures around the technology, as well as the legal infrastructure surrounding drone regulation, to catch up to state legislatures’ hopes for drone usage. As the matter stands now, allowing drones to be used in a police capacity risks a host of problems, including potential Fourth Amendment violations and an increase in the already shockingly high risk of civilian casualties related to police activity.

With the law having already gone into effect on August 1st of this year, we will just have to wait and see how these issues play out.

Until next time,

-Travis

*Special Thanks to Monica Laskos, University of Connecticut School of Law ’17, for the idea to pursue this topic.


Digital Privacy in Autonomous Vehicles

Steven Groschen, MJLST Managing Editor

The introduction of autonomous vehicles is likely to have a widespread effect on laws related to road travel. Theoretically, a well-functioning driverless car will never speed or run a red light. Thus, driverless cars are less likely to be pulled over. But what if an autonomous vehicle is pulled over and the officer wishes to perform a search of the automated system? Clues to how a court might handle this scenario are contained in Riley v. California.

Riley v. California, 134 S. Ct. 2473 (2014), explored how much protection digital content residing on an electronic device receives from unreasonable searches and seizures during a lawful arrest. The Supreme Court examined two independent fact patterns involving police officers searching an arrestee’s cellphone without a warrant. In the first, an officer seized an individual’s cellphone in the course of an arrest and proceeded to search through the contact list and pictures on the device. This search yielded evidence of gang-related activity that was later used to convict the individual. In the second, a police officer searched the phone of an individual, who was also under arrest, and located a contact entry titled “my house.” The police used the phone number in the contact entry to discover the arrestee’s address. This information, along with a few other pieces of evidence taken from the phone, helped the police secure a warrant to search the arrestee’s home.

The Riley decision made two holdings potentially relevant to autonomous cars. First, the court held that during a lawful arrest a warrant is generally required before searching the digital content on a cellphone. Second, the court suggested this protection is for the digital content and not necessarily the cellphone itself. These holdings can be interpreted as providing protection for digital content contained within automated driving systems. As a result, a plausible argument exists that, in the future, an officer will need a warrant before searching the digital content of an autonomous vehicle.

Predicting with any level of certainty how a court will handle digital content on an autonomous vehicle is difficult. Nonetheless, the discussion is important because autonomous vehicles are likely to become ubiquitous on the roadways in the next few decades. These vehicles will contain sensitive information such as route history and a log of the car’s actions. It is important to continue debating what privacy rights owners can and should expect regarding their future cars.

For an in-depth look at Riley and its implications for digital content contained in autonomous vehicles, see Sarah Aue Palodichuk’s article entitled “Driving into the Digital Age: How SDVs Will Change the Law and Its Enforcement.”


The Shift Toward Data Privacy: Workplace, Evidence, and Death

Ryan Pesch, MJLST Staff Member

I’m sure I am not alone in remembering the constant urgings to be careful about what I posted online. I was told not to send anything in an email I wouldn’t want made public, and I suppose it made some sense that the internet was commonly viewed as a sort of public forum. It was the place teens went to relieve their angst, to post pictures, and to exchange messages. But the demographic of people who use the internet is constantly growing. My mom and sister communicate their garden interests using Pinterest (despite the fact that my mom needs help to download her new podcasts), and as yesterday’s teens become today’s adults, what people are comfortable putting online continues to expand. The advent of online finances, for example, illustrates that the online world is about much more than frivolity. The truth of the matter is that the internet shapes the way we think about ourselves. And as Lisa Durham Taylor observed in her article for MJLST in the spring of 2014, the courts are taking notice.

The article concerns the role of internet privacy in the employment context, noting that where once a company could monitor its employees’ computer activity with impunity (after all, it was being done on company time and with company resources), courts have recently recognized that the internet stands for more than dalliance. Taylor notes that the connectedness of employees brings both advantages and disadvantages to the corporation: it both helps and hinders productivity, offering a more efficient way of accomplishing a task with one hand while providing the material for procrastination with the other. When the line blurs and people start using company time for personal acts, the line-drawing can get tricky. Companies have an important interest in preserving the confidentiality of their work, but courts have recently been drawing the lines to favor the employee over the employer. This is in stark contrast to the early decisions, which gave companies a broad right to discharge an “at-will” employee and found no expectation of privacy in the workplace. Fortunately, courts are beginning to recognize that the nature of a person’s online interactions makes a company’s snooping more analogous to going through an employee’s personal possessions than to monitoring an employee’s efficiency.

I would add to the picture the recently decided Supreme Court case of Riley v. California, in which the Court held that police needed a warrant to search a suspect’s phone. The Court reasoned that a warrantless search of a cell phone incident to arrest was not justified, because the nature of the technology means the police would intrude on far more than necessary to conduct normal business. It likened the phone to the locked possessions that police have long been prevented from searching incident to arrest, and wryly observed that cell phones have become “such a pervasive and insistent part of daily life that the proverbial visitor from Mars might conclude they were an important feature of human anatomy.” The “vast quantities of personal information” on a phone, and the fact that the phone itself is not a weapon, make searching it unjustified in the course of an ordinary search incident to arrest.

This respect for the data of individuals seems to signal a new and incredibly complicated age of law. When does a person have the right to protect his or her data? When can that protection be broken? As discussed in a recent post on this blog, there is an ongoing debate about what to do with the data of decedents. To me, a conservative approach makes the most sense, especially in the context of the cases discussed by Lisa Taylor and the decision in Riley v. California. Courts, however, have sided with those seeking access, because the nature of a will grants the property of the deceased to the heirs, a principle that has been extended to online “property.” What Rebecca Cummings points out to help swing the balance back in favor of privacy is that it is not just the property of the deceased to which access is granted: the nature of email means that a person’s inbox holds copies of letters from others that may never have been intended for anyone else’s eyes.

I can only imagine the number of people who, had they the presence of mind to consider this eventuality, would act differently, either in the writing of their wills or in the management of their communications. I am sure this is already something lawyers advise their clients about when discussing plans for their estates, but for many, death comes before they have the chance to fully consider these things. As generations who have grown up on the internet start to encounter the issue in earnest, I have no doubt that the message will spread, but I can’t help but feel it should be spreading already. So: what would your heirs find tucked away in the back of your online closet? And if the answer is something you’d rather not think about, perhaps we should support the shift to privacy in more aspects of the digital world.


I’m Not a Doctor, But…: E-Health Records Issues for Attorneys

Ke Huang, MJLST Lead Articles Editor

The Health Information Technology for Economic and Clinical Health Act of 2009 (HITECH Act) generally provides that, by 2015, healthcare providers must comply with the Act’s electronic health record (EHR) benchmarks or the government will reduce those providers’ Medicare payments by one percent.

These provisions of the HITECH Act are more than a health policy footnote. For attorneys especially, the growing use of EHRs raises several legal issues. Indeed, in Volume 10, Issue 1 of the Minnesota Journal of Law, Science & Technology, published six years ago, Kari Bomash analyzed the consequences of EHRs in three law-related respects. In Privacy and Public Health in the Information Age, Bomash discusses how a Minnesota Health Records Act amendment relates to: (1) privacy, especially patient consent; (2) data security (Bomash was almost prescient given the growing security concerns); and (3) data use regulations that affect medical doctors.

Bomash’s discussion is not exhaustive. EHRs also raise legal issues running the gamut from intellectual property to e-discovery to malpractice. Given that software runs EHRs, the IP industry is very much implicated, so much so that some proponents of EHRs even support open source. (Another MJLST article explains the concept of open source.)

E-discovery may be more straightforward. Like other legal parties maintaining electronically stored information, health entities storing EHRs must comply with the rules governing discovery.

And malpractice? One doctor suggested in a recent Wall Street Journal op-ed that EHRs interfere with a doctor’s quality of care. Since quality of care, or a lack thereof, is correlated with malpractice actions, commentators have raised the concern that EHRs could increase malpractice actions. A 2010 New England Journal of Medicine study addressed this topic but could not provide a conclusive answer.

Even my personal experience with EHRs is one of the reasons I wanted to become an attorney. As a child growing up in an immigrant community, I often accompanied adult immigrants to interpret at contract closings, small-business transactions, and even clinic visits. Helping in those matters sparked my interest in law. At one of the clinic visits, I noticed that an EHR print-out for my female cousin stated that she was male. I explained the error to her.

“I suppose you have to ask them to change it, then,” she said.

I did. I learned from talking to the clinic administrator that the EHR software was programmed to recognize female names and, for names that were ambiguous, as my cousin’s was, automatically categorized the patient as male, even when, as in my cousin’s case, the visit was for an ob-gyn check-up.
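As a purely hypothetical sketch of the defaulting logic the administrator described (the name list, function, and output codes here are invented for illustration, not the clinic’s actual software), the bug might look something like this:

```typescript
// Hypothetical reconstruction of the flawed categorization logic described
// above. The name list and interface are invented for illustration.
const RECOGNIZED_FEMALE_NAMES = new Set(["mary", "susan", "linda", "maria"]);

function inferSex(firstName: string): "F" | "M" {
  // The flaw: any name not on the "recognized female" list, including
  // ambiguous names, silently falls through to "M". There is no "unknown"
  // category and no prompt for a human to confirm.
  return RECOGNIZED_FEMALE_NAMES.has(firstName.toLowerCase()) ? "F" : "M";
}

console.log(inferSex("Mary")); // "F"
console.log(inferSex("Jun"));  // "M", even if the patient is female
```

A safer design would treat unlisted names as unknown and require explicit confirmation at registration, exactly the kind of data-quality detail that attorneys working with EHRs may find themselves untangling.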


Stuck in Between a Rock and a Genomic Hard Place

Will Orlady, MJLST Staff Member

In Privatizing Biomedical Citizenship: Risk, Duty, and Potential in the Circle of Pharmaceutical Life, Professor Jonathan Khan wrote: “genomic research is at an impasse.” Though genomic research has advanced incrementally since the completion of the first draft of the human genome, Khan asserts, “few of the grandest promises of genomics have materialized.” This apparent lack of progress is a complex issue, and one may be left asking whether, within the current economic and regulatory scheme, genomics actually has promising answers to give. But Khan’s work cites biomedical researchers who claim that what is needed to propel genomic research forward is simple: more bodies.

Indeed, it is a simple answer, but to which question, or questions? Khan’s article explores the “interconnections among five . . . federally sponsored biomedical initiatives of the past decade in order to illuminate critical aspects of the current drive to get bodies.” To be sure, the article provides a fine starting analysis of public biomedical programs, synthesizing much of the previous research on biomedical research participation, and it evaluates previously proposed methods for increasing participation in genomic research. Khan’s article, however, left me with more questions than answers. If the public and private sectors cannot work together to produce results, who is left to ensure progress? Is progress currently feasible? Are we being too hasty and impatient in demanding results from an admittedly young scientific discipline? And, ultimately, if study participants/subjects are expected to contribute their own genetic material or bodies, what do they get in return?

Khan’s article attempts to address the final question: if we are to create a legal or social obligation to contribute to genomic research for the sake of the public, what benefit (or, at the least, what assurance of safety) do contributors receive in return? Clearly, the issues associated with creating a system of duties that provides no corresponding rights are plentiful. Underlying this discussion is the notion that mandated participation might be necessary to ensure the timely progress of genomic research. Herein lies a problem: “[t]hese duties effectively privatize citizenship, recasting service to the political community as a function of service to [an] . . . enterprise of biomedical research. . . . ” What is more, Khan is keen to point out that, time and time again, promises of genomic advancement in the hands of collaborating private and public entities have failed to produce the promised results.

If we are to go forward privatizing citizenship, creating duties for persons to use their bodies for the benefit of society, we must be careful to ensure (1) that individual rights in the outcome of the research are secured, and (2) that society will in fact benefit from the collectively imposed obligations.

Although Khan’s article leaves many questions unanswered, I share his wariness of creating a public duty to contribute to biomedical research. Such complex issues are not easily resolved. Torpid genomic research is troubling, but so is the notion of privatized citizenship that ascribes duties without granting corresponding rights. Though more bodies may be needed to further the timely advance of genomic research, policymakers and academics alike should be cautious about creating any program that compromises the integrity of personal privacy for the sake of public advancement without granting corresponding rights.