
Tuesday, April 29, 2014

Spring Edition of CCR's Massive Round-Up of New Law Articles on the CFAA, Cybercrime, Privacy, 4th Amendment, Surveillance, and more

Some impressive articles have been published since the last round-up I did in February; if you missed that post, see: Massive round-up of new law articles, covering privacy, Fourth Amendment, GPS, cell site, cybercrime, big data, revenge porn, drones, and more

New Legal Scholarship (with abstracts where available)

Orin S. Kerr, The Next Generation Communications Privacy Act, 162 U. Pa. L. Rev. 373 (2014)
In 1986, Congress enacted the Electronic Communications Privacy Act (ECPA) to regulate government access to Internet communications and records. ECPA is widely regarded as outdated, and ECPA reform is now on the Congressional agenda. At the same time, existing reform proposals retain the structure of the 1986 Act and merely tinker with a few small aspects of the statute. This Article offers a thought experiment about what might happen if Congress were to repeal ECPA and enact a new privacy statute to replace it. 
The new statute would look quite different from ECPA because overlooked changes in Internet technology have dramatically altered the assumptions on which the 1986 Act was based. ECPA was designed for a network world with high storage costs and only local network access. Its design reflects the privacy threats of such a network, including high privacy protection for real-time wiretapping, little protection for noncontent records, and no attention to particularity or jurisdiction. Today’s Internet reverses all of these assumptions. Storage costs have plummeted, leading to a reality of almost total storage. Even U.S.-based services now serve a predominantly foreign customer base. A new statute would need to account for these changes. 
This Article contends that a next generation privacy act should contain four features. First, it should impose the same requirement on access to all contents. Second, it should impose particularity requirements on the scope of disclosed metadata. Third, it should impose minimization rules on all accessed content. And fourth, it should impose a two-part territoriality regime with a mandatory rule structure for U.S.-based users and a permissive regime for users located abroad.

And a Response to Kerr's article: Ryan Calo, Communications Privacy for and by Whom?, 162 U. Pa. L. Rev. Online 231 (2014)
Andrea M. Matwyshyn, Privacy, the Hacker Way, 87 S. Cal. L. Rev. 1 (2014)
This Article seeks to clarify the relationship between contract law and promises of privacy and information security. It challenges three commonly held misconceptions in privacy literature regarding the relationship between contract and data protection—the propertization fatalism, the economic value fatalism, and the displacement fatalism—and argues in favor of embracing contract law as a way to enhance consumer privacy. Using analysis from Sorrell v. IMS Health Inc., marketing theory, and the work of Pierre Bourdieu, it argues that the value in information contracts is inherently relational: consumers provide “things of value”—rights of access to valuable informational constructs of identity and context—in exchange for access to certain services provided by the data aggregator. This Article presents a contract-based consumer protection approach to privacy and information security. Modeled on trade secret law and landlord-tenant law, it advocates for courts and legislatures to adopt a “reasonable data stewardship” approach that relies on a set of implied promises—nonwaivable contract warranties and remedies—to maintain contextual integrity of information and improve consumer privacy. 
Matthew F. Meyers, GPS “Bullets” and the Fourth Amendment, 4 Wake Forest L. Rev. Online 18 (2014) (No Abstract)

From the Fordham Law Review, April 2014 | Vol. 82, No. 5:
Peter Margulies, The NSA in Global Perspective: Surveillance, Human Rights, and International Counterterrorism (No Abstract)
Casey J. McGowan, The Relevance of Relevance: Section 215 of the USA PATRIOT Act and the NSA Metadata Collection Program
In June 2013, a National Security Agency (NSA) contractor, Edward Snowden, leaked classified documents exposing a number of secret government programs. Among these programs was the “telephony metadata” collection program under which the government collects records from phone companies containing call record data for nearly every American. News of this program created considerable controversy and led to a wave of litigation contesting the validity of the program. 
The legality of the metadata collection program has been challenged on both constitutional and statutory grounds. The program derives its authority from Section 215 of the USA PATRIOT Act, codified as 50 U.S.C. § 1861. The statute requires that there be reasonable grounds to believe the data collected is “relevant to an authorized investigation.” The government deems all these records “relevant” based on the fact that they are used to find patterns and connections in preventing terrorist activity. Critics of the program, however, assert that billions of records cannot possibly be relevant when a negligible portion of those records are actually linked to terrorist activity. This Note examines the conflicting interpretations of “relevant,” and concludes that while the current state of the law permits bulk data collection, the power of the NSA to collect records on such a large scale must be reined in.
Thomas Rosso, Unlimited Data?: Placing Limits on Searching Cell Phone Data Incident to a Lawful Arrest 
The “search incident to arrest exception” is one of several exceptions to the general requirement that police must obtain a warrant supported by probable cause before conducting a search. Under the exception, an officer may lawfully search an arrestee’s person and the area within the arrestee’s immediate control without a warrant or probable cause, so long as the search is conducted contemporaneously with the lawful arrest. The U.S. Supreme Court has justified the exception based on the need for officers to discover and remove any weapons or destructible evidence that may be within the arrestee’s reach. Additionally, the Court has held that, under the exception, police may search any containers found on the arrestee’s person without examining the likelihood of uncovering weapons or evidence related to the arrestee’s offense. In light of these principles, should the exception permit officers to search the data of a cell phone found on an arrestee’s person? 
In January 2014, the Supreme Court granted certiorari to review two appellate rulings and resolve a split among the circuits and state courts on this question. This Note examines three approaches courts have taken to resolve the issue: a broad approach, a middle approach, and a narrow approach. This Note argues that the Supreme Court should adopt the narrow approach and prohibit warrantless searches of cell phone data under the exception.
Stephen Moor, Cyber Attacks and the Beginnings of an International Cyber Treaty, North Carolina Journal of International Law and Commercial Regulation (Fall 2013) (No Abstract)

Katherine Booth Wellington, Cyberattacks on Medical Devices and Hospital Networks: Legal Gaps and Regulatory Solutions, 30 Santa Clara High Tech. L.J. 139 (2014)
Cyberattacks on medical devices and hospital networks are a real and growing threat. Malicious actors have the capability to hack pacemakers and insulin pumps, shut down hospital networks, and steal personal health information. This Article analyzes the laws and regulations that apply to cyberattacks on medical devices and hospital networks and argues that the existing legal structure is insufficient to prevent these attacks. While the Computer Fraud and Abuse Act and the Federal Anti-Tampering Act impose stiff penalties for cyberattacks, it is often impossible to identify the actor behind a cyberattack—greatly decreasing the deterrent power of these laws. Few laws address the role of medical device manufacturers and healthcare providers in protecting against cyberattacks. While HIPAA incentivizes covered entities to protect personal health information, HIPAA does not apply to most medical device manufacturers or cover situations where malicious actors cause harm without accessing personal health information. Recent FDA draft guidance suggests that the agency has begun to impose cybersecurity requirements on medical device manufacturers. However, this guidance does not provide a detailed roadmap for medical device cybersecurity and does not apply to healthcare providers. Tort law may fill in the gaps, although it is unclear if traditional tort principles apply to cyberattacks. New legal and regulatory approaches are needed. One approach is industry self-regulation, which could lead to the adoption of industry-wide cybersecurity standards and lay the groundwork for future legal and regulatory reform. A second approach is to develop a more forward-looking and flexible FDA focus on evolving cybersecurity threats. A third approach is a legislative solution. 
Expanding HIPAA to apply to medical device manufacturers and to any cyberattack that causes patient harm is one way to incentivize medical device manufacturers and healthcare providers to adopt cybersecurity measures. All three approaches provide a starting point for considering solutions to twenty-first century cybersecurity threats.
Merritt Baer, Who is the Witness to an Internet Crime: The Confrontation Clause, Digital Forensics, and Child Pornography, 30 Santa Clara High Tech. L.J. 31 (2014)
The Sixth Amendment’s Confrontation Clause guarantees the accused the right to confront witnesses against him. In this article I examine child pornography prosecution, in which we must apply this constitutional standard to digital forensic evidence. I ask, “Who is the witness to an Internet crime?” 
The Confrontation Clause proscribes the admission of hearsay. In Ohio v. Roberts, the Supreme Court stated that the primary concern was reliability and that hearsay might be admissible if the reliability concerns were assuaged. Twenty-four years later, in Crawford v. Washington, the Supreme Court repositioned the Confrontation Clause of the Sixth Amendment as a procedural right. Even given assurances of reliability, “testimonial” evidence requires a physical witness. 
This witness production requirement could have been sensible in an era when actions were physically tied to humans. But in an Internet age, actions may take place at degrees removed from any physical person. 
The hunt for a witness to digital forensic evidence involved in child pornography prosecution winds through a series of law enforcement protocols, on an architecture owned and operated by private companies. Sentencing frameworks associated with child pornography similarly fail to reflect awareness of the way that actions occur online, even while they reinforce what is at stake. 
The tensions I point to in this article are emblematic of emerging questions in Internet law. I show that failing to link the application of law and its undergirding principles to a digital world does not escape the issue, but distorts it. This failure increases the risk that our efforts to preserve Constitutional rights are perverted or made impotent.
Yana Welinder, Facing Real-Time Identification in Mobile Apps & Wearable Computers, 30 Santa Clara High Tech. L.J. 89 (2014)
The use of face recognition technology in mobile apps and wearable computers challenges individuals’ ability to remain anonymous in public places. These apps can also link individuals’ offline activities to their online profiles, generating a digital paper trail of their every move. The ability to go off the radar allows for quiet reflection and daring experimentation—processes that are essential to a productive and democratic society. Given what we stand to lose, we ought to be cautious with groundbreaking technological progress. It does not mean that we have to move any slower, but we should think about potential consequences of the steps that we take. 
This article maps out the recently launched face recognition apps and some emerging regulatory responses to offer initial policy considerations. With respect to current apps, app developers should consider how the relevant individuals could be put on notice given that the apps will not only be using information about their users, but also about the persons being identified. They should also consider how the apps could minimize their data collection and retention and keep the data secure. Today’s face recognition apps mostly use photos from social networks. They therefore call for regulatory responses that consider the context in which users originally shared the photos. Most importantly, the article highlights that the Federal Trade Commission’s first policy response to consumer applications that use face recognition did not follow the well-established principle of technology neutrality. The article argues that any regulation with respect to identification in real time should be technology neutral and narrowly address harmful uses of computer vision without hampering the development of useful applications. 
Valerie Redmond, Note, I Spy with My Not So Little Eye: A Comparison of Surveillance Law in the United States and New Zealand, 37 Fordham Int’l L.J. 733 (2014) (No Abstract)

Lawrence Rosenthal, Binary Searches and the Central Meaning of the Fourth Amendment, 22 Wm. & Mary Bill Rts. J. 881 (2014) (No Abstract)

Jason P. Nance, School Surveillance and the Fourth Amendment, 2014 Wisc. L. Rev. 79 (2014)
In the aftermath of several highly publicized incidents of school violence, public school officials have increasingly turned to intense surveillance methods to promote school safety. The current jurisprudence interpreting the Fourth Amendment generally permits school officials to employ a variety of strict measures, separately or in conjunction, even when their use creates a prison-like environment for students. Yet, not all schools rely on such strict measures. Recent empirical evidence suggests that low-income and minority students are much more likely to experience intense security conditions in their schools than other students, even after taking into account factors such as neighborhood crime, school crime, and school disorder. These empirical findings are problematic on two related fronts. First, research suggests that students subjected to these intense surveillance conditions are deprived of quality educational experiences that other students enjoy. Second, the use of these measures perpetuates social inequalities and exacerbates the school-to-prison pipeline.    
Under the current legal doctrine, students have almost no legal recourse to address conditions creating prison-like environments in schools. This Article offers a reformulated legal framework under the Fourth Amendment that is rooted in the foundational Supreme Court cases evaluating students’ rights under the First, Fourth, and Fourteenth Amendments. The historical justification courts invoke to abridge students’ constitutional rights in schools, including their Fourth Amendment rights, is to promote the educational interests of the students. This justification no longer holds true when a school creates a prison-like environment that deteriorates the learning environment and harms students’ educational interests. This Article maintains that in these circumstances, students’ Fourth Amendment rights should not be abridged but strengthened.
Meredith Mays Espino, Sometimes I Feel Like Somebody’s Watching Me . . . Read?: A Comment On The Need For Heightened Privacy Rights For Consumers Of Ebooks, 30 J. Marshall J. Info. Tech. & Privacy L. 281 (2013)

Emily Katherine Poole, Hey Girls, Did You Know? Slut-Shaming on the Internet Needs to Stop, 48 USF L. Rev. 221 (2013)
When it comes to sexual expression, females are denied the freedoms enjoyed by males. Even though sexual acts often take both a male and a female, it is the girl that faces society’s judgment when her behavior is made public. The Internet has created a forum for such "slut shaming" to occur on a whole new level. Now when a girl is attacked for her sexuality, her attackers can be spread across the U.S., or even the world. The Internet is an incredible resource for sharing and gaining information, but it is also allowing attacks on female sexuality to flourish.  
While slut shaming can and does occur to females of all ages, this Article focuses on its prevalence among teen and preteen girls, falling under the umbrella of cyberbullying. Because actions and legislation that address cyber slut-shaming can also remedy other types of cyberbullying, the problems and proposed solutions elaborated in this Article can be expanded to include all types of cyberbullying. I chose to focus on one specific and pervasive harm — that caused by sexual shaming — to help bring attention to both the repercussions of cyberbullying and to the broader problem of gender inequality that persists in forums and social networking sites across the Internet. 
Sprague, Robert, No Surfing Allowed: A Review and Analysis of Legislation Prohibiting Employers from Demanding Access to Employees’ and Job Applicants’ Social Media Accounts (January 31, 2014). Albany Law Journal of Science and Technology, Vol. 24, 2014
This article examines recent state legislation prohibiting employers from requesting username and password information from employees and job applicants in order to access restricted portions of those employees’ and job applicants’ personal social media accounts. This article raises the issue of whether this legislation is even needed, from both practical and legal perspectives, focusing on: (a) how prevalent the practice is of requesting employees’ and job applicants’ social media access information; (b) whether alternative laws already exist which prohibit employers from requesting employees’ and job applicants’ social media access information; and (c) whether any benefits can be derived from this legislative output. After analyzing the potential impact of this legislation on employees, job applicants, and employers, this article concludes that such legislation is but an answer seeking a problem and raises more questions than it answers.
From the Washington Law Review, Volume 89 | Number 1 | March 2014
Danielle Keats Citron & Frank Pasquale, The Scored Society: Due Process for Automated Predictions, 89 Wash. L. Rev. 1 
Big Data is increasingly mined to rank and rate individuals. Predictive algorithms assess whether we are good credit risks, desirable employees, reliable tenants, valuable customers—or deadbeats, shirkers, menaces, and “wastes of time.” Crucial opportunities are on the line, including the ability to obtain loans, work, housing, and insurance. Though automated scoring is pervasive and consequential, it is also opaque and lacking oversight. In one area where regulation does prevail—credit—the law focuses on credit history, not the derivation of scores from data.  
Procedural regularity is essential for those stigmatized by “artificially intelligent” scoring systems. The American due process tradition should inform basic safeguards. Regulators should be able to test scoring systems to ensure their fairness and accuracy. Individuals should be granted meaningful opportunities to challenge adverse decisions based on scores miscategorizing them. Without such protections in place, systems could launder biased and arbitrary data into powerfully stigmatizing scores. 
Elizabeth E. Joh, Policing by Numbers: Big Data and the Fourth Amendment, 89 Wash. L. Rev. 35 
The age of “big data” has come to policing. In Chicago, police officers are paying particular attention to members of a “heat list”: those identified by a risk analysis as most likely to be involved in future violence. In Charlotte, North Carolina, the police have compiled foreclosure data to generate a map of high-risk areas that are likely to be hit by crime. In New York City, the N.Y.P.D. has partnered with Microsoft to employ a “Domain Awareness System” that collects and links information from sources like CCTVs, license plate readers, radiation sensors, and informational databases. In Santa Cruz, California, the police have reported a dramatic reduction in burglaries after relying upon computer algorithms that predict where new burglaries are likely to occur. Unlike the data crunching performed by Target, Walmart, or Amazon, the introduction of big data to police work raises new and significant challenges to the regulatory framework that governs conventional policing. This article identifies three uses of big data and the questions that these tools raise about conventional Fourth Amendment analysis. Two of these examples, predictive policing and mass surveillance systems, have already been adopted by a small number of police departments around the country. A third example — the potential use of DNA databank samples — presents an untapped source of big data analysis. While seemingly quite distinct, these three examples of big data policing suggest the need to draw new Fourth Amendment lines now that the government has the capability and desire to collect and manipulate large amounts of digitized information. 
Lawrence B. Solum, Artificial Meaning, 89 Wash. L. Rev. 69  (No Abstract)
Harry Surden, Machine Learning and the Law, 89 Wash. L. Rev. 87 (No Abstract)
David C. Vladeck, Machines Without Principals: Liability Rules and Artificial Intelligence, 89 Wash. L. Rev. 117 (No Abstract)
All of Volume 40, Issue 2 of the William Mitchell Law Review: Legal Issues in a World of Electronic Data, which includes the following articles:
Roland L. Trope and Stephen J. Humes, Before Rolling Blackouts Begin: Briefing Boards on Cyber Attacks That Target and Degrade the Grid 
Damien Riehl and Jumi Kassim, Is “Buying” Digital Content Just “Renting” for Life? Contemplating a Digital First-Sale Doctrine 
Stephen T. Middlebrook and Sarah Jane Hughes, Regulating Cryptocurrencies in the United States: Current Issues and Future Directions 
Nathan Newman, The Costs of Lost Privacy: Consumer Harm and Rising Economic Inequality in the Age of Google
Slobogin, Christopher, Panvasive Surveillance, Political Process Theory and the Nondelegation Doctrine (April 23, 2014). Georgetown Law Journal, Vol. 102, 2014; Vanderbilt Public Law Research Paper No. 14-13 (SSRN)
Using the rise of the surveillance state as its springboard, this Article makes a new case for the application of administrative law principles to law enforcement. It goes beyond asserting, as scholars of the 1970s did, that law enforcement should be bound by the types of rules that govern other executive agencies, by showing how the imperative of administrative regulation flows from a version of John Hart Ely’s political process theory and principles derived from the closely associated nondelegation doctrine. Part I introduces the notion of panvasive law enforcement — large-scale police actions that are not based on individualized suspicion — and exposes the incoherence of the Supreme Court’s “special needs” treatment of panvasive investigative techniques under the Fourth Amendment. It then contrasts the Court’s jurisprudence, and the variations of it proposed by scholars, to the representation-reinforcing alternative suggested by Ely’s work, which would require that panvasive searches and seizures be approved by a body that is representative of the affected group and be applied evenly. Part II explores the impact of political process theory on panvasive surveillance that is not currently considered a search or seizure under the Fourth Amendment, using fusion centers, camera surveillance, drone flights and the NSA’s metadata program as examples. Part III mines administrative law principles to show how the rationale underlying the nondelegation doctrine — if not the (supposedly moribund) doctrine itself — can help ensure that the values of representative democracy and transparency are maintained even once control over panvasive surveillance is largely ceded to the Executive Branch.

Kerr, Orin S., The Fourth Amendment and the Global Internet (April 23, 2014). Stanford Law Review, Vol. 65, 2015, Forthcoming (SSRN)
This article considers how Fourth Amendment law should adapt to the increasingly worldwide nature of Internet surveillance. It focuses on two types of problems not yet addressed by courts. First, the Supreme Court’s decision in United States v. Verdugo-Urquidez prompts several puzzles about how the Fourth Amendment treats monitoring on a global network where many lack Fourth Amendment rights. For example, can online contacts help create those rights? What if the government mistakenly believes that a target lacks Fourth Amendment rights? How does the law apply to monitoring of communications between those who have and those who lack Fourth Amendment rights? The second category of problems follows from different standards of reasonableness that apply outside the United States and at the international border. Does the border search exception apply to purely electronic transmission? And if reasonableness varies by location, is the relevant location the search, the seizure, or the physical person?  
The article explores and answers each of these questions through the lens of equilibrium-adjustment. Today’s Fourth Amendment doctrine is heavily territorial. The article aims to adapt existing principles for the transition from a domestic physical environment to a global networked world in ways that maintain the preexisting balance of Fourth Amendment protection. On the first question, it rejects online contacts as a basis for Fourth Amendment protection; allows monitoring when the government wrongly but reasonably believes that a target lacks Fourth Amendment rights; and limits monitoring between those who have and those who lack Fourth Amendment rights. On the second question, it contends that the border search exception should not apply to electronic transmission and that reasonableness should follow the location of data seizure. The Internet requires search and seizure law to account for the new facts of international investigations. The solutions offered in this article offer a set of Fourth Amendment rules tailored to the reality of global computer networks.
Marthews, Alex and Tucker, Catherine, Government Surveillance and Internet Search Behavior (March 24, 2014) (SSRN) 
This paper uses data from Google Trends on search terms from before and after the surveillance revelations of June 2013 to analyze whether Google users' search behavior shifted as a result of an exogenous shock in information about how closely their internet searches were being monitored by the U.S. government. We use data from Google Trends on search volume for 282 search terms across eleven different countries. These search terms were independently rated for their degree of privacy-sensitivity along multiple dimensions. Using panel data, our results suggest that cross-nationally, users were less likely to search using search terms that they believed might get them in trouble with the U.S. government. In the U.S., this was the main subset of search terms that were affected. However, internationally there was also a drop in traffic for search terms that were rated as personally sensitive. These results have implications for policy makers in terms of understanding the actual effects on search behavior of disclosures relating to the scale of government surveillance on the Internet and their potential effects on international competitiveness. 
Hollis, Duncan B., Re-Thinking the Boundaries of Law in Cyberspace: A Duty to Hack? (April 12, 2014). in Cyberwar: Law & Ethics for Virtual Conflicts (J. Ohlin et al., eds., Oxford University Press, 2014 Forthcoming) (SSRN)
Warfare and boundaries have a symbiotic relationship. Whether as its cause or effect, States historically used war to delineate the borders that divided them. Laws and borders have a similar relationship. Sometimes laws are the product of borders as when national boundaries delineate the reach of States’ authorities. But borders may also be the product of law; laws regularly draw lines between permitted and prohibited conduct or bound off required acts from permissible ones. Both logics are on display in debates over international law in cyberspace. Some characterize cyberspace as a unique, self-governing ‘space’ that requires its own borders and the drawing of tailor-made rules therein. For others, cyberspace is merely a technological medium that States can govern via traditional territorial borders with rules drawn ‘by analogy’ from pre-existing legal regimes.  
This chapter critiques current formulations drawing law from boundaries and boundaries from law in cyberspace with respect to (a) its governance; (b) the use of force; and (c) international humanitarian law (IHL). In each area, I identify theoretical problems that exist in the absence of any uniform theory for why cyberspace needs boundaries. At the same time, I elaborate functional problems with existing boundary claims – particularly by analogy – in terms of their (i) accuracy, (ii) effectiveness and (iii) completeness. These prevailing difficulties over whether, where, and why borders are needed in cyberspace suggest the time is ripe for re-appraising the landscape.  
This chapter seeks to launch such a re-thinking project by proposing a new rule of IHL – a Duty to Hack. The Duty to Hack would require States to use cyber-operations in their military operations whenever they are the least harmful means available for achieving military objectives. Thus, if a State can achieve the same military objective by bombing a factory or using a cyber-operation to take it off-line temporarily, the Duty to Hack requires that State to pursue the latter course. Although novel, I submit the Duty to Hack more accurately and effectively accounts for IHL’s fundamental principles and cyberspace’s unique attributes than existing efforts to foist legal boundaries upon State cyber-operations by analogy. Moreover, adopting the Duty to Hack could constitute a necessary first step to resolving the larger theoretical and functional challenges currently associated with law’s boundaries in cyberspace.
Stopczynski, Arkadiusz and Greenwood, Dazza and Hansen, Lars Kai and Pentland, Alex, Privacy for Personal Neuroinformatics (April 21, 2014) (SSRN)
Human brain activity collected in the form of Electroencephalography (EEG), even with a low number of sensors, is an extremely rich signal raising legal and policy issues. Traces collected from multiple channels and with high sampling rates capture many important aspects of participants' brain activity and can be used as a unique personal identifier. The motivation for sharing EEG signals is significant, as a means to understand the relation between brain activity and well-being, or for communication with medical services. As the equipment for such data collection becomes more available and widely used, the opportunities for using the data are growing; at the same time, however, inherent privacy risks are mounting. The same raw EEG signal can be used, for example, to diagnose mental diseases, find traces of epilepsy, and decode personality traits. The current practice of the informed consent of the participants for the use of the data either prevents reuse of the raw signal or does not truly respect participants' right to privacy by reusing the same raw data for purposes much different than originally consented to. Here we propose an integration of a personal neuroinformatics system, Smartphone Brain Scanner, with a general privacy framework, openPDS. We show how raw high-dimensionality data can be collected on a mobile device, uploaded to a server, and subsequently operated on and accessed by applications or researchers, without disclosing the raw signal. Those extracted features of the raw signal, called answers, are of significantly lower dimensionality, and provide the full utility of the data in a given context, without the risk of disclosing the sensitive raw signal. Such an architecture significantly mitigates a very serious privacy risk related to raw EEG recordings floating around and being used and reused for various purposes.
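For readers unfamiliar with the "answers" architecture the abstract describes, the core idea is that only low-dimensional features computed from the raw signal ever leave the data store. A minimal sketch of that reduction step is below; the band definitions, function names, and channel count are illustrative assumptions, not taken from the paper or from the openPDS API:

```python
import numpy as np

def band_power(channel, fs, band):
    """Mean spectral power of one EEG channel within a frequency band."""
    freqs = np.fft.rfftfreq(channel.size, d=1.0 / fs)
    power = np.abs(np.fft.rfft(channel)) ** 2 / channel.size
    mask = (freqs >= band[0]) & (freqs < band[1])
    return float(power[mask].mean())

def answers(raw_eeg, fs):
    """Reduce a (channels x samples) recording to a few band-power
    features per channel; the raw signal itself is never returned."""
    bands = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
    return {name: [band_power(ch, fs, b) for ch in raw_eeg]
            for name, b in bands.items()}

# 14 channels, 10 seconds sampled at 128 Hz of simulated raw signal
raw = np.random.randn(14, 1280)
feats = answers(raw, fs=128)
```

An application querying such a system would receive only `feats` (a handful of numbers per channel), which supports uses like well-being analytics while making reconstruction of the identifying raw trace far harder.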
Reeves, Shane R. and Johnson, William J., Autonomous Weapons: Are You Sure These are Killer Robots? Can We Talk About It? (April 30, 2014). The Army Lawyer 1 (April 2014) (SSRN)
The rise of autonomous weapons is creating understandable concern for the international community as it is impossible to predict exactly what will happen with the technology. This uncertainty has led some to advocate for a preemptive ban on the technology. Yet the emergence of a new means of warfare is not a unique phenomenon and is assumed within the Law of Armed Conflict. Past attempts at prohibiting emerging technologies’ use as weapons — such as aerial balloons in Declaration IV of the 1899 Hague Convention — have failed, as a prohibitive regime denies the realities of warfare. Further, those exploring the idea of autonomous weapons are sensitive not only to their legal obligations, but also to the various ethical and moral questions surrounding the technology. Rather than attempting to preemptively ban autonomous weapons before understanding the technology’s potential, efforts should be made to pool the collective intellectual resources of scholars and practitioners to develop a road forward. Perhaps this would be the first step to a more comprehensive and assertive approach to addressing the other pressing issues of modern warfare.
Timothy C. MacDonnell, Justice Scalia’s Fourth Amendment: Text, Context, Clarity, And Occasional Faint-Hearted Originalism (SelectedWorks) (2014)
Since joining the United States Supreme Court in 1986, Justice Scalia has been one of the most prominent voices on the Fourth Amendment, having written twenty majority opinions, twelve concurrences, and eight dissents on the topic. Justice Scalia’s opinions have had a significant effect on the Court’s Fourth Amendment jurisprudence. Under his pen, the Court has altered its test for determining when the Fourth Amendment should apply; provided a vision for how technology’s encroachment on privacy should be addressed; and articulated the standard for determining whether government officials are entitled to qualified immunity in civil suits involving alleged Fourth Amendment violations. In most of Justice Scalia’s opinions, he has championed the originalist/textualist theory of constitutional interpretation. Based on that theory, he has advocated that the text and context of the Fourth Amendment should govern how the Court interprets most questions of search and seizure law. His Fourth Amendment opinions have also included an emphasis on clear, bright-line rules that can be applied broadly to Fourth Amendment questions. However, there are Fourth Amendment opinions in which Justice Scalia has strayed from these commitments, particularly in the areas of the special needs doctrine and qualified immunity. The article asserts that Justice Scalia’s non-originalist approach in these spheres threatens the cohesiveness of his Fourth Amendment jurisprudence, and could, if not corrected, unbalance the Fourth Amendment in favor of law enforcement interests.

Thursday, April 24, 2014

Must Read Law Review Article -- Personal Curtilage: Fourth Amendment Security in Public

Andrew Guthrie Ferguson has a new law review article in the April 2014 issue (Vol. 55, No. 4) of William & Mary Law Review, entitled: Personal Curtilage: Fourth Amendment Security in Public. The abstract is below:
Do citizens have any Fourth Amendment protection from sense-enhancing surveillance technologies in public? This Article engages a timely question as new surveillance technologies have redefined expectations of privacy in public spaces. It proposes a new theory of Fourth Amendment security based on the ancient theory of curtilage protection for private property. Curtilage has long been understood as a legal fiction that expands the protection of the home beyond the formal structures of the house. Based on custom and law protecting against both nosy neighbors and the government, curtilage was defined by the actions the property owner took to signal a protected space. In simple terms, by building a wall around one's house, the property owner marked out an area of private control. So, too, the theory of personal curtilage turns on persons being able to control the protected areas of their lives in public by similarly signifying that an area is meant to be secure from others. 
This Article develops a theory of personal curtilage built on four overlapping foundational principles. First, persons can build a constitutionally protected space secure from governmental surveillance in public. Second, to claim this space as secure from governmental surveillance, the person must affirmatively mark that space in some symbolic manner. Third, these spaces must be related to areas of personal autonomy or intimate connection, be it personal, familial, or associational. Fourth, these contested spaces--like traditional curtilage--will be evaluated by objectively balancing these factors to determine if a Fourth Amendment search has occurred. Adapting the framework of traditional trespass, an intrusion by sense-enhancing technologies into this protected personal curtilage would be a search for Fourth Amendment purposes. The Article concludes that the theory of personal curtilage improves and clarifies the existing Fourth Amendment doctrine and offers a new framework for future cases. It also highlights the need for a new vision of trespass to address omnipresent sense-enhancing surveillance technologies.

Tuesday, February 25, 2014

Featured Paper: Bridging the Cellular Divide: A Search for Consensus Regarding Law Enforcement Access to Historical Cell Data

From the February 2014 Issue of the Cardozo Law Review:

Zachary Ross, Bridging the Cellular Divide: A Search for Consensus Regarding Law Enforcement Access to Historical Cell Data.

Excerpt:
Technological change is often a double-edged sword--it enables and enriches our lives, but also allows for new means of exploitation and control. As social, architectural, and market barriers protecting longstanding notions of personal space erode, individuals increasingly rely on the legal system as a defense to arbitrary invasions of privacy. Paradoxically, the same forces that make the need for robust privacy protections more compelling also make the existing legal framework outdated and inapposite. 
These contradictions are readily apparent in the contemporary debate over the legal restrictions on government access to cell site location information (CSLI). This data, constantly collected by cell phone service providers (CSPs) in order to manage their networks, has the potential to provide a detailed map of an individual cell user's movements from place to place over extended periods of time. Furthermore, the quantity and precision of location data collected by CSPs is constantly increasing, becoming more revealing, and more valuable to law enforcement in the process. Despite the potential intimacy of this data and its growing relevance to criminal investigations, the legal protection afforded CSLI is hotly disputed, and at present varies greatly among (and sometimes even within) jurisdictions--with courts sometimes requiring a warrant, and sometimes allowing unfettered access upon a lesser evidentiary showing. This lack of uniformity has been exacerbated by a recent Fifth Circuit ruling on government access to CSLI, which generated a different rule than had previously been adopted by the Third Circuit. The vastly disparate treatment of government requests for CSLI has created a chaotic system ripe for abuse, and all but guaranteed Supreme Court review of the issue in the near future, as the Court itself seems to have implicitly acknowledged. 
This Note will examine the complex interaction between privacy, surveillance, and technology through an exploration of the contested legal terrain governing law enforcement access to historical CSLI--location data recorded by CSPs which reveal an individual's past movements. 

Tuesday, February 4, 2014

Massive round-up of new law articles, covering privacy, Fourth Amendment, GPS, cell site, cybercrime, big data, revenge porn, drones, and more

This Article examines a question that has become increasingly important in the emerging surveillance society: Should the law treat information as private even though others know about it? This is the third-party privacy problem. Part II explores two competing conceptions of privacy — the binary and contextual conceptions. Part III describes two features of the emerging surveillance society that should change the way we address the third-party privacy problem. One feature, “surveillance on demand,” results from exponential increases in data collection and aggregation. The other feature, “uploaded lives,” reflects a revolution in the type and amount of information that we share digitally. Part IV argues that the binary conception cannot protect privacy in the surveillance society because it fails to account for the new realities of surveillance on demand and uploaded lives. Finally, Part V illustrates how courts and legislators can implement the contextual conception to deal with two emerging surveillance society problems — facial recognition technology and geolocation data.

Privacy laws rely on the unexamined assumption that the collection of data is not speech. That assumption is incorrect. Privacy scholars, recognizing an imminent clash between this long-held assumption and First Amendment protections of information, argue that data is different from the sort of speech the Constitution intended to protect. But they fail to articulate a meaningful distinction between data and other more traditional forms of expression. Meanwhile, First Amendment scholars have not paid sufficient attention to new technologies that automatically capture data. These technologies reopen challenging questions about what “speech” is. 
This Article makes two overdue contributions to the First Amendment literature. First, it argues that when the scope of First Amendment coverage is ambiguous, courts should analyze the government’s motive for regulating. Second, it highlights and strengthens the strands of First Amendment theory that protect the right to create knowledge. Whenever the state regulates in order to interfere with the creation of knowledge, that regulation should draw First Amendment scrutiny. 
In combination, these claims show clearly why data must receive First Amendment protection. When the collection or distribution of data troubles lawmakers, it does so because data has the potential to inform and to inspire new opinions. Data privacy laws regulate minds, not technology. Thus, for all practical purposes, and in every context relevant to privacy debates, data is speech.
The police tend to think that those who evade surveillance are criminals. Yet the evasion may only be a protest against the surveillance itself. Faced with the growing surveillance capacities of the government, some people object. They buy “burners” (prepaid phones) or “freedom phones” from Asia that have had all tracking devices removed, or they hide their smartphones in ad hoc Faraday cages that block their signals. They use Tor to surf the internet. They identify tracking devices with GPS detectors. They avoid credit cards and choose cash, prepaid debit cards, or bitcoins. They burn their garbage. At the extreme end, some “live off the grid” and cut off all contact with the modern world. 
These are all examples of what I call privacy protests: actions individuals take to block or to thwart government surveillance for reasons unrelated to criminal wrongdoing. Those engaged in privacy protests do so primarily because they object to the presence of perceived or potential government surveillance in their lives. How do we tell the difference between privacy protests and criminal evasions, and why does it matter? Surprisingly scant attention has been given to these questions, in part because Fourth Amendment law makes little distinction between ordinary criminal evasions and privacy protests. This Article discusses the importance of these ordinary acts of resistance, their place in constitutional criminal procedure, and their potential social value in the struggle over the meaning of privacy.
Conor M. Reardon, Cell Phones, Police Recording, and the Intersection of the First and Fourth Amendments, 63 Duke Law Journal 735-779 (2013). Abstract:
In a recent spate of highly publicized incidents, citizens have used cell phones equipped with video cameras to record violent arrests. Oftentimes they post their recordings on the Internet for public examination. As the courts have recognized, this behavior lies close to the heart of the First Amendment. 
But the Constitution imperfectly protects this new form of government monitoring. Fourth Amendment doctrine generally permits the warrantless seizure of cell phones used to record violent arrests, on the theory that the recording contains evidence of a crime. The Fourth Amendment inquiry does not evaluate a seizing officer’s state of mind, permitting an official to seize a video for the very purpose of suppressing its contents. Moreover, Supreme Court precedent is typically read to ignore First Amendment interests implicated by searches and seizures. 
This result is perverse. Courts evaluating these seizures should stop to recall the Fourth Amendment’s origins as a procedural safeguard for expressive interests. They should remember, too, the Supreme Court’s jurisprudence surrounding seizures of obscene materials—an area in which the Court carefully shaped Fourth Amendment doctrine to protect First Amendment values. Otherwise reasonable seizures can become unreasonable when they threaten free expression, and seizures of cell phones used to record violent arrests are of that stripe. Courts should therefore disallow this breed of seizure, trusting the political branches to craft a substitute procedure that will protect law-enforcement interests without doing violence to First Amendment freedoms.
Elizabeth Friedler, Protecting the Innocent—the Need to Adapt Federal Asset Forfeiture Laws to Protect the Interests of Third Parties in Digital Asset Seizures, Cardozo Arts & Entertainment Law Journal, Volume 32, Issue 1 (2013).

Jana Sutton, Of Information, Trust, and Ice Cream: A Recipe for a Different Perspective on the Privacy of Health Information, 55 Ariz. L. Rev. 1171 (2014). Abstract:
The concept of privacy is inescapable in modern society. As technology develops rapidly and online connections become an integral part of our daily routines, the lines between what may or may not be acceptable continue to blur. Individual autonomy is important. We cannot, however, allow it to suffocate the advancement of technology in such vital areas as public health. Although this Note cannot lay out detailed instructions to balance the desire for autonomy and the benefits of free information, it attempts to provide some perspective on whether we are anywhere close to striking the right balance. When the benefits of health information technology are so glaring, and yet its progress has been so stifled, perhaps we have placed far too much value—at least in the health care context—on individual privacy.
Kevin S. Bankston & Ashkan Soltani, Tiny Constables and the Cost of Surveillance: Making Cents Out of United States v. Jones, 123 YALE L.J. ONLINE 335 (2014). Abstract:
In United States v. Jones, five Supreme Court Justices wrote that government surveillance of one’s public movements for twenty-eight days using a GPS device violated a reasonable expectation of privacy and constituted a Fourth Amendment search. Unfortunately, they didn’t provide a clear and administrable rule that could be applied in other government surveillance cases. In this Essay, Kevin Bankston and Ashkan Soltani draw together threads from the Jones concurrences and existing legal scholarship and combine them with data about the costs of different location tracking techniques to articulate a cost-based conception of the expectation of privacy that both supports and is supported by the concurring opinions in Jones.
Schmitt, Michael N. and Vihul, Liis, The International Law of Attribution During Proxy 'Wars' in Cyberspace (January 30, 2014). 1 Fletcher Security Review (2014 Forthcoming). Abstract:
The article examines the use of non-State actors by States to conduct cyber operations against other States. In doing so, it examines attribution of a non-State actor's cyber operations to a State pursuant to the law of State responsibility, attribution of a non-State actor's cyber armed attack to a State for the purposes of a self-defense analysis, and attribution of cyber military operations to a State in the context of determining whether an international armed conflict has been initiated. These three very different legal inquiries are often confused with each other. The article seeks to deconstruct the issue of attribution into its various normative components.
Kate Crawford & Jason Schultz, Big Data and Due Process: Toward a Framework to Redress Predictive Privacy Harms, 55 B.C. L. Rev. 93 (2014). Abstract:
The rise of “Big Data” analytics in the private sector poses new challenges for privacy advocates. Through its reliance on existing data and predictive analysis to create detailed individual profiles, Big Data has exploded the scope of personally identifiable information (“PII”). It has also effectively marginalized regulatory schema by evading current privacy protections with its novel methodology. Furthermore, poor execution of Big Data methodology may create additional harms by rendering inaccurate profiles that nonetheless impact an individual’s life and livelihood. To respond to Big Data’s evolving practices, this Article examines several existing privacy regimes and explains why these approaches inadequately address current Big Data challenges. This Article then proposes a new approach to mitigating predictive privacy harms—that of a right to procedural data due process. Although current privacy regimes offer limited nominal due process-like mechanisms, a more rigorous framework is needed to address their shortcomings. By examining due process’s role in the Anglo-American legal system and building on previous scholarship about due process for public administrative computer systems, this Article argues that individuals affected by Big Data should have similar rights to those in the legal system with respect to how their personal data is used in such adjudications. Using these principles, this Article analogizes a system of regulation that would provide such rights against private Big Data actors.
Larkin, Paul J., 'Revenge Porn,' State Law, and Free Speech (January 14, 2014).  Abstract:
For most of our history, only celebrities — presidents, movie stars, professional athletes, and the like — were at risk of having their everyday exploits and activities photographed and shown to the world. But that day is gone. Today, we all face the risk of being made into a celebrity due to the ubiquity of camera-equipped cell phones and the ease of uploading photographs or videos onto the Internet. But a particularly troubling aspect of this phenomenon goes by the name of "revenge porn" — that is, the Internet posting of photographs of naked former wives and girlfriends, sometimes in intimate positions or activities. Revenge porn is an example of malicious conduct that injures the welfare of someone who mistakenly trusted an intimate partner. Tort law traditionally has allowed parties to recover damages for such violations of privacy, and criminal law also can prohibit such conduct, but there are several First Amendment defenses that the responsible parties can assert to fend off liability. This article argues that allowing a victim of revenge porn to recover damages for publication that breaches an implicit promise of confidentiality is faithful to tort and criminal law principles and will not punish or chill the legitimate expression of free speech.
Jonathan Olivito, Beyond the Fourth Amendment: Limiting Drone Surveillance Through the Constitutional Right to Informational Privacy, 74 Ohio St. L.J. 669 (2013). 

The entirety of Volume 74, Issue 6 in the Ohio State Law Journal; Symposium: The Second Wave of Global Privacy Protection (Titles Below)
Peter Swire, The Second Wave of Global Privacy Protection: Symposium Introduction, 74 Ohio St. L.J. 841 (2013). 
Ann Bartow, Privacy Laws and Privacy Levers: Online Surveillance Versus Economic Development in the People’s Republic of China, 74 Ohio St. L.J. 853 (2013). 
Andrew Clearwater & J. Trevor Hughes, In the Beginning . . . An Early History of the Privacy Profession, 74 Ohio St. L.J. 897 (2013). 
Claudia Diaz, Omer Tene & Seda Gürses, Hero or Villain: The Data Controller in Privacy Law and Technologies, 74 Ohio St. L.J. 923 (2013). 
A. Michael Froomkin, “PETs Must Be on a Leash”: How U.S. Law (and Industry Practice) Often Undermines and Even Forbids Valuable Privacy Enhancing Technology, 74 Ohio St. L.J. 965 (2013). 
Woodrow Hartzog, Social Data, 74 Ohio St. L.J. 995 (2013). 
Dennis D. Hirsch, In Search of the Holy Grail: Achieving Global Privacy Rules Through Sector-Based Codes of Conduct, 74 Ohio St. L.J. 1029 (2013). 
Gus Hosein & Caroline Wilson Palow, Modern Safeguards for Modern Surveillance: An Analysis of Innovations in Communications Surveillance Techniques, 74 Ohio St. L.J. 1071 (2013). 
Anil Kalhan, Immigration Policing and Federalism Through the Lens of Technology, Surveillance, and Privacy, 74 Ohio St. L.J. 1105 (2013). 
Bartosz M. Marcinkowski, Privacy Paradox(es): In Search of a Transatlantic Data Protection Standard, 74 Ohio St. L.J. 1167 (2013). 
Thomas Margoni & Mark Perry, Deep Pockets, Packets, and Harbors, 74 Ohio St. L.J. 1195 (2013). 
Omer Tene, Privacy Law’s Midlife Crisis: A Critical Assessment of the Second Wave of Global Privacy Laws, 74 Ohio St. L.J. 1217 (2013). 
Yofi Tirosh & Michael Birnhack, Naked in Front of the Machine: Does Airport Scanning Violate Privacy?, 74 Ohio St. L.J. 1263 (2013). 
Yang Wang, Pedro Giovanni Leon, Xiaoxuan Chen, Saranga Komanduri, Gregory Norcie, Kevin Scott, Alessandro Acquisti, Lorrie Faith Cranor & Norman Sadeh, From Facebook Regrets to Facebook Privacy Nudges, 74 Ohio St. L.J. 1307 (2013). 
Tal Z. Zarsky & Norberto Nuno Gomes de Andrade, Regulating Electronic Identity Intermediaries: The “Soft eID” Conundrum, 74 Ohio St. L.J. 1335 (2013).
The entirety of Volume 14, Issue 1 of the  Journal of High Technology Law (2014) (Titles Below).
After Jones, The Deluge: The Fourth Amendment's Treatment Of Information, Big Data And The Cloud, Lon A. Berk, 14 J. High Tech L. 1 (2014). 
The Legislative Response To Employers' Requests For Password Disclosure, Jordan M. Blanke, 14 J. High Tech L. 42 (2014). 
A Shot In The Dark: An Analysis Of The SEC's Response To The Rise Of Dark Pools, Edwin Batista, 14 J. High Tech L. 83 (2014). 
Privacy Protections Left Wanting: Looking At Doctrine And Safeguards On Law Enforcements' Use Of GPS Tracking And Cell Phone Records With A Focus On Massachusetts, Lloyd Chebaclo, 14 J. High Tech L. 120 (2014).

Sunday, October 27, 2013

Featured Paper: The Legislative Response to Mass Police Surveillance

Stephen Rushin has a forthcoming paper in the Brooklyn Law Review entitled: The Legislative Response to Mass Police Surveillance.

The abstract is below:
Police departments have rapidly adopted mass surveillance technologies in an effort to fight crime and improve efficiency. I have previously described this phenomenon as the growth of the digitally efficient investigative state. This new technological order transforms traditional law enforcement by improving the efficiency of everyday policing activities and retaining copious amounts of data on both suspicious and unsuspicious behavior. Empirical evidence shows that police surveillance technologies are common and rapidly expanding in urban America. In the absence of legislative action, police departments have adopted widely disparate internal policies. The Supreme Court had the opportunity to rein in the scope of police surveillance in United States v. Jones. But the Court could not agree on whether technological improvements in efficiency transform an otherwise legal policing tactic into an unconstitutional search. Nor could the Court agree on whether a person may have a reasonable expectation of privacy in public movement. Post-Jones, the jurisprudence of police surveillance emerged as incoherent as ever.  
I have previously argued that the judiciary should regulate police surveillance technologies. While it remains possible that the judiciary will someday make such a doctrinal shift, the immediate responsibility for regulating police surveillance technology falls on state legislatures. In this Article, I offer a model statute to regulate mass police surveillance. The model statute limits indiscriminate data collection. It also caps data retention for personally identifiable information. It excludes from criminal court any locational evidence obtained in violation of the statute. And it gives the state attorney general authority to bring suit against police departments that fail to abide by the law. This legislation would give discretion to police departments to craft data policies fitting their city’s unique needs, while also encouraging consistency and fairness.

Saturday, March 2, 2013

Featured Paper: Domestic Drone Use and the Mosaic Theory

From Sean Sullivan at UNM Law School. The article can be found here: Domestic Drone Use and the Mosaic Theory.

Abstract:

The use of unmanned aerial drones - operated by remote pilots and capable of conducting pinpoint strikes on targets around the world - has revolutionized the fight against terrorism. Within the past few years, however, drones have also been used for domestic security and law enforcement purposes, and such local use is likely to expand in the near future. Whether the government’s use of emerging, sophisticated technologies comports with the 4th Amendment’s protection against unreasonable searches and seizures has confounded the courts, and there are growing concerns that traditional 4th Amendment analyses are no longer workable in the context of modern technologies. In U.S. v. Jones (2012), the Supreme Court applied a relatively new doctrine, the “mosaic theory,” in determining whether the government’s use of technology, in this case a G.P.S. tracking system, was consistent with fundamental 4th Amendment protections. 

This paper explores whether the “mosaic theory,” laid out by legal scholar Orin Kerr and espoused by the Court in Jones, can be applicable to 4th Amendment challenges to domestic drone use. This paper first explains the extent to which drones are already operational domestically, and briefly discusses proposals to expand their domestic capabilities; second, provides a brief overview of the traditional 4th Amendment analyses in the realm of emerging technologies, with an eye toward determining whether the “property-driven” or “reasonable expectation of privacy” doctrines are no longer applicable to such sophisticated technologies; third, discusses the Jones case as well as the “mosaic theory” in order to provide a solid foundation from which to draw conclusions about its applicability to domestic drone use; and fourth, analyzes a particular type of domestic drone use under the “mosaic theory” rubric, and determines whether it is an appropriate framework to ensure 4th Amendment protections in the context of emerging technologies going forward. 

The domestic uses of drones are increasing and have been largely overlooked by the public. At the same time, the courts are struggling with how to check such use against the constitutional right to be free from unreasonable searches and seizures. An appropriate analytical framework is needed to assist the courts in ensuring that the government’s domestic use of drones does not infringe on the people’s well-established civil liberties before drones become an even more ubiquitous part of the domestic American experience or facilitate the creation of a perpetual “nanny state” under the guise of providing national security.

Tuesday, August 14, 2012

EFF files amicus in D.C. Circuit Court against use of CSLI in remanded Jones case

Back in April, Jeffrey wrote that Antoine Jones wasn't off the hook for his crimes because of the ruling in United States v. Jones, 132 S. Ct. 945 (2012). Rather, instead of using the GPS tracking data they had collected (illegally), the police decided to use Cell Site Location Information (CSLI). Jeffrey's previous article can be found here - Jones II: This time, the government seeks to use cell site location information.  If you're looking to read more on the subject, we have additional content that can be found here.

On Monday, the Electronic Frontier Foundation filed an amicus brief in support of Antoine Jones, arguing that six months' worth of CSLI should not be obtainable without a warrant. The EFF drew parallels between this situation and the GPS tracking that occurred in the original instance. Additionally, the EFF forwards an argument in the brief that could not be used in the context of GPS tracking - that CSLI could actually provide information about occurrences inside the home. This is important because courts have tended to give the most Fourth Amendment protection to the confines of a private home - see, for example, Karo or Kyllo.

The EFF's brief also addresses third-party doctrine, the Stored Communications Act, and even CALEA.

The brief can be found here: BRIEF AMICI CURIAE OF THE ELECTRONIC FRONTIER FOUNDATION AND CENTER FOR DEMOCRACY & TECHNOLOGY IN SUPPORT OF DEFENDANT ANTOINE JONES’ MOTION TO SUPPRESS CELL SITE DATA
The EFF also has a story, here: Government Faces New Warrantless Surveillance Battle After Losing Landmark GPS Tracking Case